Whither Augmented, Mixed and Virtual Reality?

23/03/2020

XR is a term which has become more prominent in the last few years. It encapsulates virtual, augmented and mixed reality. The definitions of each have become blurred over the past decade, with companies coining their own terms to describe their products. The new IDTechEx report, “Augmented, Mixed and Virtual Reality 2020-2030”, distils this range of terms and products, compares the technologies used in them, and produces a market forecast for the next decade.

The report discusses 83 different companies and 175 products in VR (virtual reality), AR (augmented reality) and MR (mixed reality) markets. This article specifically discusses the findings on the virtual reality market.

Virtual reality (VR) involves creating a simulated environment which a user can perceive as real. This is achieved by stimulating the various senses with appropriate signals: most commonly visual (via displays and optics) and auditory (via headphones or speakers), but increasingly also haptic (touch) sensations. Generating a realistic virtual environment requires both the appropriate stimuli and systems that direct how those stimuli should change, whether automatically or in response to user interaction. As such, VR relies on a variety of components and systems, including displays, optics, sensors, communication and processing, delivered via both hardware and associated software.

There are three main groups of VR headset – PC VR, standalone VR and smartphone VR. PC VR has the user interface and display worn on the body, but computing and power are offloaded to an external computer; this is where most of the commercial hardware revenue is made today. Standalone VR is a dedicated device (no tethering) with all required computing and components on board. Finally, smartphone/mobile VR uses the smartphone's processor, display and sensors to power the VR experience, with only a very cheap accessory needed to convert the phone into a headset. The report discusses the revenue split for these three sectors in full, and an example image is shown in the figure on the right.
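As a rough illustration of how these three groups differ in where the computing sits, the sketch below restates the article's descriptions as a simple data structure. The attribute names and values are illustrative assumptions, not taken from the report.

```python
from dataclasses import dataclass

@dataclass
class VRHeadsetGroup:
    name: str
    compute_location: str  # where processing and power live
    tethered: bool         # needs a physical link to an external computer
    extra_hardware: str    # what is needed beyond the headset itself

# The three groups described above, restated as data (illustrative only).
VR_GROUPS = [
    VRHeadsetGroup("PC VR", "external PC", True, "gaming PC"),
    VRHeadsetGroup("Standalone VR", "on the headset", False, "none"),
    VRHeadsetGroup("Smartphone VR", "smartphone", False, "cheap holder accessory"),
]

for group in VR_GROUPS:
    print(f"{group.name}: compute on {group.compute_location}, tethered={group.tethered}")
```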

The report discusses the likelihood of a shift in the devices used by consumers, for example from a PC VR to a standalone VR headset, since standalone devices provide greater freedom of movement and accessibility for different use cases. One example of a standalone VR product is the Oculus Quest, released in 2019. It was one of the first standalone devices aimed at gaming, with all the heat management and processing systems on the headset itself. Oculus is one of the big players in the VR market and has a range of products, some of which are shown in the table and images below.

These headsets provide a range of experiences for the user, at different price points. Founded in 2012 and bought by Facebook for $2.3bn in 2014, Oculus has continued to grow and produce VR products for a range of markets. Details of the growth of the VR market are included in the report for a range of companies and their different use cases. The overall market is expected to grow, as shown in the plot below.

The full image is available in the report.

VR, AR & MR, as with nearly any technology area, must build on what has come before. The existing wave of interest, investment and progress in the space has been built on technology developed in other areas, most notably the smartphone. Many components in VR, AR & MR headsets, from the displays, to the sensors (IMUs, 3D imaging, cameras and more), to the batteries and power management, build directly on components that were so heavily invested in around the smartphone, and that investment continues with the future potential of XR headsets as a target. This report provides a complete overview of the companies, technologies and products in augmented, virtual and mixed reality, allowing the reader to gain a deeper understanding of this exciting technology.

#PAuto @IDTechEx @IDTechExShow


Augmented and mixed reality: what is it, and where is it going?

10/03/2020

XR is a term that has become more prominent in the last few years. It encapsulates virtual, augmented and mixed reality. The definitions of each have become blurred over the past decade, with companies coining their own terms to describe their products. The new IDTechEx report, “Augmented, Mixed and Virtual Reality 2020-2030”, distils this range of terms and products, compares the technologies used in them, and produces a market forecast for the next decade. This premium article discusses AR (augmented reality) and MR (mixed reality) in more detail.

The report discusses 83 different companies and 175 products in the VR (virtual reality), AR (augmented reality) and MR (mixed reality) markets. This promotional article specifically discusses the report's findings on the augmented and mixed reality markets.

Augmented Reality (AR) and Mixed Reality (MR) are two technologies which have become more prominent in the past ten years. AR is the use of computer technology to superimpose digital objects and data on top of a real-world environment. MR is similar to AR, but the digital objects interact spatially with the real-world objects, rather than being superimposed as “floating images” on top of them. AR and MR are also closely related to VR: there is a cross-over in application and technology, as some VR headsets simulate the real space and then add extra artificial content for the user. For this article, however, AR and MR products are considered to be those which allow the user in some way to directly see the real world around them. The main target sectors of AR and MR appear to be industry and enterprise markets; with the high cost of individual products, there has been less penetration into the consumer space.

AR and MR products are being used in a variety of settings. One way they are being used is to address a problem called “the skills gap”. This describes the large portion of the skilled workforce expected to retire in the next ten years, taking its knowledge and skills with it. This knowledge needs to be passed on to new, unskilled employees, and some companies propose that AR/VR technology can fill the gap. This was one of the key areas discussed at events IDTechEx analysts attended in 2019 while researching this report.

AR use in manufacturing and remote assistance has also grown in the past 10 years, leading some AR companies to target primarily enterprise rather than consumer spaces. Although there are fewer direct needs or problems which AR can solve for a consumer market, smartphone AR provides an excellent starting point for technology-driven generations to create, develop and use an XR-enabled smartphone for entertainment, marketing and advertising purposes. One example of smartphone AR mentioned in the report is IKEA Place, an application with which a user can place a virtual piece of IKEA furniture in their room to compare against their current furniture. It gives users a window into how AR can supplement their environment and support day-to-day activities such as purchasing and visualising products bought from an online marketplace.

AR and MR companies have historically received higher funding per round than VR companies; Magic Leap, for example, has raised $2.6bn in funding since its launch in 2017, but only released a Creator Edition of its headset in 2019. AR and MR products tend to be more expensive than VR products, as they are marketed to niche use cases. These are discussed in greater detail in the report, for example in the plot below, which shows this tendency for AR/MR products to be more expensive than VR products.
The report compares augmented and mixed reality products and splits them into three categories: PC AR/MR, standalone AR/MR and smartphone/mobile AR/MR. PC products require a physical PC connection, standalone products do not require a PC, and smartphone products use a smartphone's capabilities to deliver the immersive experience. Standalone AR/MR has seen more distinct product types in the past decade, and this influences the decisions made when forecasting the decade to come.

The report predicts an AR/MR market worth over $20bn in 2030, reflecting the high level of interest in this technology. It also provides a complete overview of the companies, technologies and products in augmented, virtual and mixed reality, allowing the reader to gain a deeper understanding of this exciting technology.

In conclusion, VR, AR & MR, as with nearly any technology area, must build on what has come before. The technologies they rely on have attracted heavy investment, now targeting the future potential of XR headsets. “Augmented, Mixed and Virtual Reality 2020-2030” provides a complete overview of the companies, technologies and products in augmented, virtual and mixed reality, allowing the reader to gain a deeper understanding of this exciting technology.


Data privacy!

28/01/2020

It’s been another busy year for hackers. According to the Central Statistics Office, nearly 1 in 5 (18%) of Irish businesses experienced ICT-related incidents; 87% of these resulted in the unavailability of ICT services, and 41% resulted in the destruction, corruption or disclosure of data.

Noel O’Grady, writer of this piece, is the head of Sungard Availability Services Ireland and has over 20 years of experience working with leading technology firms including HP, Vodafone and Dell in providing critical production and recovery services to enterprise-level organisations.

Last year saw a number of high-profile security incidents making the headlines. In April, 3,600 accounts belonging to former customers of Ulster Bank were compromised, resulting in some customers’ personal details being released. In July, the Football Association of Ireland confirmed that malware was discovered on its payroll server following an attempted hack on IT systems.

Entering a new decade, digital technologies will continue to permeate every aspect of modern life, and the security of IT systems will come under increasing scrutiny. This will be driven by two major consequences of today’s hyper-connected world. Firstly, the sheer number of systems and devices which have now become digitalised has vastly expanded the cybersecurity threat landscape, potentially multiplying vulnerabilities or points of entry for hackers. Simultaneously, consumers and businesses alike demand constant availability in the products and services they use, reducing the tolerance for periods of downtime.

As a result, the security of data is nothing less than a global issue, on a par with national security, economic stability and even the physical security of citizens. It is with this in mind that Data Privacy Day is observed on this day (28th January 2020), a global initiative which aims to spread awareness of the fundamental role that cybersecurity plays.

One of the most important developments in the field of data privacy was the introduction of the General Data Protection Regulation (GDPR) in May 2018. Nearly two years on, it is timely to review how the new regulatory environment has succeeded in achieving its goals, especially given that almost one in three European businesses is still not compliant.

Data Privacy Day 2020

GDPR works by penalising organisations with inadequate data protection through sizeable fines. While this has established an ethical framework from which European organisations can set out strategies for protecting personal data, one issue that is still often overlooked is the result of an IT outage, which prevents a business from keeping its services running. While a server or an organisation’s infrastructure is down, data is at risk of exposure, and the company is therefore at risk of failing compliance. IT and business teams need to locate and close any vulnerabilities in IT systems or business processes, and switch over to disaster recovery arrangements if they believe data has been corrupted.

This is especially pertinent in Ireland, where, according to a spokesperson for the Department of Business, Enterprise and Innovation (DoBEI), “Data centre presence…raises our visibility internationally as a technology-rich, innovative economy.” A strategic European hub for many multi-national technology giants, Ireland is currently home to 54 data centres, with another 10 under construction and planning permission for a further 31. While this growth in Ireland’s data centre market is a huge advantage for the national economy, Irish businesses must also tread with caution as they shoulder the responsibility for the security and availability of the countless mission-critical applications and processes which rely on them.

An organisation’s speed and effectiveness of response will be greatly improved if it has at its fingertips the results of a Data Protection Impact Assessment (DPIA) that details all the personal data that an organisation collects, processes and stores, categorised by level of sensitivity. Data Privacy Day is a great opportunity to expose unknown risks that organisations face, but moving forward, it is vital that business leaders embed privacy into every operation. This is the only sustainable way to ensure compliance on an ongoing basis.

#Cybersecurity @SungardASUK @brands2life

Flood monitoring.

27/01/2020
Monitoring is an essential component of natural flooding management, helping to define appropriate measures, measure their success, keep stakeholders informed, identify mistakes, raise alarms when necessary, inform adaptive management and help guide future research.

Great Fen showing Holme Fen woods top left and new ponds and meres in April

Flooding is a natural process, but it endangers lives and causes heavy economic loss. Furthermore, flood risk is expected to increase with climate change and increased urbanisation, so a heavy responsibility lies with those that allocate funding and formulate flood management strategy. In the following article, Nigel Grimsley from OTT Hydromet explains how the success of such plans (both their design and implementation) depends on the accuracy and reliability of the monitoring data upon which they rely.

Climate projections for Britain suggest that rainfall will increase in winter and decrease in summer, and that individual rainfall events may increase in intensity, especially in winter. These projections imply an increased risk of flooding.

Emphasising the urgent need for action on flood risk, (British) Environment Agency chairwoman Emma Howard Boyd has said that on current trends global temperature could rise by between 2 °C and 4 °C by 2100, and that some communities may even need to move because of the risk of floods. Launching a consultation on the agency’s flood strategy, she said: “We can’t win a war against water by building away climate change with infinitely high flood defences.”

In response, Mike Childs, head of science at Friends of the Earth, said: “Smarter adaptation and resilience building – including natural flood management measures like tree-planting – is undeniably important but the focus must first and foremost be on slashing emissions so that we can avoid the worst consequences of climate chaos in the first place.”

Historically, floodplains have been exploited for agricultural and urban development, which has increased the exposure of people, property and other infrastructure to floods. Flood risk management therefore focused on measures to protect communities and industry in affected areas. However, flood risk is now addressed on a wider catchment scale, so that initiatives in one part of a catchment do not have negative effects further downstream. This catchment-based approach is embodied within the EU Floods Directive 2007/60/EC, and in recent years those responsible for flood management have increasingly looked for solutions employing techniques which work with natural hydrological and morphological processes, features and characteristics to manage the sources and pathways of flood waters. These techniques are known as natural flood management (NFM) and include the restoration, enhancement and alteration of natural features, but exclude traditional flood defence engineering that effectively disrupts these natural processes.

NFM seeks to create efficiency and sustainability in the way the environment is managed by recognising that when land and water are managed together at the catchment scale it is possible to generate whole catchment improvements with multiple benefits.

Almost all NFM techniques aim to slow the flow of water and, whilst closely connected, can be broadly categorised as infiltration, conveyance and storage.

Infiltration
Land use changes such as set-aside, switching arable to grassland or restricted hillside cropping, can improve infiltration and increase water retention. In addition, direct drilling, ‘no-till’ techniques and cross slope ploughing can have a similar effect. These land use techniques are designed to reduce the soil compaction which increases run-off. Livestock practices such as lower stocking rates and shorter grazing seasons can also help. Field drainage can be designed to increase storage and reduce impermeability, which is also aided by low ground pressure vehicles. The planting of shrubs and trees also helps infiltration and retention by generating a demand for soil moisture, so that soils have a greater capacity to absorb water. Plants also help to bind soil particles, resulting in less erosion – the cause of fertility loss and sedimentation in streams and rivers.

Conveyance
Ditches and moorland grips can be blocked to reduce conveyance, and river profiles can be restored to slow the flow. In the past, peats and bogs have been drained to increase cropping areas, but this damages peatlands and reduces their capacity to retain water and store carbon. The restoration of peatland therefore relies on techniques to restore moisture levels. Pumping and drainage regimes can be modified, and landowners can create strategically positioned hedges, shelter belts and buffer strips to reduce water conveyance.

Storage
Rivers can be reconnected with restored floodplains and river re-profiling, leaky dams, channel works and riparian improvements can all contribute to improved storage capability. In urban areas permeable surfaces and underground storage can be implemented, and washlands and retention ponds can be created in all areas. As mentioned above, the re-wetting of peatland and bogs helps to increase storage capacity.

Many of the effects of NFM might be achieved with the re-introduction of beavers, which build dams that reduce peak flows, create pools and saturate soil above their dams. The dams also help to remove pollutants such as phosphates. Beavers do not eat fish, instead preferring aquatic plants, grasses and shrubs during the summer and woody plants in winter. Beavers are now being introduced in a number of areas in trials to determine their value in the implementation of NFM. One of the key benefits offered by beavers is their ability to quickly repair and rebuild dams that are damaged during extreme weather. However, whilst the potential benefits of beavers are well known, several groups have expressed concern with the prospect of their widespread introduction. For example, farmers and landowners may find increased areas of waterlogged land due to blocked drainage channels. In addition, dams present a threat to migratory fish such as salmon and sea trout.

Beavers are native to Britain and used to be widespread, but they were hunted to extinction during the 17th century. However, other non-native species such as signal crayfish can have a detrimental effect on flood protection because they burrow into river banks causing erosion, bank collapse and sediment pollution. Signal crayfish are bigger, grow faster, reproduce more quickly and tolerate a wider range of conditions than the native white-clawed crayfish. Signal crayfish are also voracious predators, feeding on fish, frogs, invertebrates and plants, and as such can create significant negative ecological effects.

NFM benefits
NFM provides protection for smaller flood events, reduces peak flooding and delays the arrival of the flood peak downstream. However, it does not mitigate the risk from extreme flood events. Effective flood management strategy therefore tends to combine NFM with hard engineering measures. Nevertheless, NFM generally provides a broader spectrum of other benefits.

The creation of new woodlands and wetlands produces biodiverse habitats with greater flood storage capacity. They also enable more species to move between habitats. NFM measures that reduce soil erosion, run-off and sedimentation also help to improve water quality and thereby improve habitats. In particular, these measures reduce nutrient and sediment loading lower in the catchment, two issues which can have dramatic effects on water quality and amenity.

Land use and land management measures help to reduce the loss of topsoil and nutrients. This improves agricultural productivity and lowers the cost of fertilizers. Furthermore, a wide range of grants are available for NFM measures, such as the creation of green spaces and floodplains, to make them more financially attractive to farmers and landowners.

Many NFM measures help in the fight against climate change. For example, wetlands and woodlands are effective at storing carbon and removing carbon dioxide from the atmosphere. Measures that reduce surface run off and soil erosion, such as contour cultivation, can also reduce carbon loss from soil.

Monitoring NFM
Given the wide range of potential NFM benefits outlined above, the number and type of parameters to be monitored are likely to be equally diverse. Baseline data is essential if the impacts of implemented measures are to be assessed, but this may not always be deliverable. For example, it may only be possible to collect one season of data prior to a five year project. However, it may be possible to secure baseline data from other parties. In all instances data should of course be accurate, reliable, relevant and comparable.

Monitoring data should be used to inform the design of NFMs. For example, a detailed understanding of the ecology, geomorphology, hydrology and meteorology of the entire catchment will help to ensure that the correct measures are chosen. These measures should be selected in partnership with all stakeholders, and ongoing monitoring should provide visibility of the effects of NFM measures. Typically stakeholders will include funders, project partners, local communities, landowners, regulators and local authorities.

Since NFM measures are designed to benefit an entire catchment, it is important that monitoring is also catchment-wide. However, this is likely to be a large area so there will be financial implications, particularly for work that is labour-intensive. Consequently, it will be necessary to prioritise monitoring tasks and to deploy remote, automatic technology wherever it is cost-effective.

OTT ecoN Sensor (with wiper)

Clearly, key parameters such as rainfall, groundwater level, river level and surface water quality should be monitored continuously in multiple locations if the benefits of NFM are to be measured effectively. It is fortunate, therefore, that all of these measurements can be taken continuously by instruments that can be left to monitor in remote locations without a requirement for frequent visits to calibrate, service or change power supplies. As a business, OTT Hydromet has focused on the development of this capability for many years, developing sensors that are sufficiently rugged to operate in potentially aggressive environments, data loggers with enormous capacity but very low power requirements, and advanced communications technologies so that field data can be instantly viewed by all stakeholders.

Recent advances in data management have led to web-enabled solutions such as Hydromet Cloud which, via a website and app, delivers the backend infrastructure to receive, decode, process, display and store measurement data from nearly any remote hydromet monitoring station or sensor on a cloud-based data hosting platform. As a consequence, alarms can be raised automatically, which facilitates integration with hard engineering flood control measures. Hydromet Cloud also provides access to both current and historic measurement data, enabling stakeholders to view the status of an entire catchment on one screen.
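As a rough sketch of how such a cloud-hosted monitoring chain can raise alarms automatically, the example below polls a hypothetical telemetry endpoint and flags river-level readings above a threshold. The URL, JSON field names and threshold are assumptions made purely for illustration; they are not the actual Hydromet Cloud API.

```python
import json
import time
import urllib.request

# Hypothetical REST endpoint returning the latest reading from a remote station.
STATION_URL = "https://example.org/api/stations/catchment-01/latest"
RIVER_LEVEL_ALARM_M = 2.5  # assumed alarm threshold in metres

def fetch_latest(url: str) -> dict:
    """Fetch and decode the most recent measurement record (JSON)."""
    with urllib.request.urlopen(url, timeout=10) as response:
        return json.load(response)

def check_alarms(reading: dict) -> list:
    """Return alarm messages for any thresholds exceeded in this reading."""
    alarms = []
    level = reading.get("river_level_m")
    if level is not None and level > RIVER_LEVEL_ALARM_M:
        alarms.append(f"River level {level:.2f} m exceeds {RIVER_LEVEL_ALARM_M} m")
    return alarms

if __name__ == "__main__":
    while True:
        try:
            for alarm in check_alarms(fetch_latest(STATION_URL)):
                print("ALARM:", alarm)  # in practice: notify stakeholders, trigger defences
        except OSError as error:
            print("Station unreachable:", error)
        time.sleep(15 * 60)  # poll every 15 minutes
```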

Holme Fen – a monitoring lesson from the 1850s

Holme Fen post

Surrounded by prime agricultural land to the south of Peterborough (Cambridgeshire, GB), the fens originally contained many shallow lakes, of which Whittlesey Mere was the largest, covering around 750 hectares in the summer and around twice that in the winter. Fed by the River Nene, the mere was very shallow and was the last of the ‘great meres’ to be drained and thereby converted to cultivatable land.

Led by William Wells, a group of local landowners funded and arranged the drainage project, which involved the development of a newly invented steam powered centrifugal pump which was capable of raising over 100 tons of water per minute by 2 or 3 feet. A new main drain was constructed to take water to the Wash. Conscious of the likely shrinking effect of drainage on the peaty soil, Wells instigated the burial of a measurement post, which was anchored in the Oxford Clay bedrock and cut off at the soil surface. In 1851 the original timber post was replaced by a cast iron column which is believed to have come from the Crystal Palace in London.

By installing a measurement post, Wells demonstrated remarkable foresight. As the drainage proceeded, the ground level sank considerably; by 1.44 metres in the first 12 years, and by about 3 metres in the first 40 years. Today, around 4 metres of the post is showing above ground, recording the ground subsidence since 1852. The ground level at Holme Post is now 2.75 metres below sea level – the lowest land point in Great Britain.
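A quick back-of-the-envelope check of the average subsidence rates implied by those figures (illustrative arithmetic only, using the numbers quoted above):

```python
# Average subsidence rates at Holme Post, from the figures quoted above.
rate_first_12_years = 1.44 / 12         # about 0.12 m per year just after drainage
rate_first_40_years = 3.0 / 40          # about 0.075 m per year over four decades
rate_since_1852 = 4.0 / (2020 - 1852)   # about 0.024 m per year averaged to today

print(f"First 12 years: {rate_first_12_years:.3f} m/yr")
print(f"First 40 years: {rate_first_40_years:.3f} m/yr")
print(f"1852 to date:   {rate_since_1852:.3f} m/yr")
```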
Several complications have arisen as a result of the drainage. Firstly, there has been a huge impact on local ecology and biodiversity with the loss of a large area of wetland. Also, as the ground level subsided, it became less sustainable to pump water up into the main drain.

Holme Fen is now a National Nature Reserve, managed by Natural England, as is the nearby Woodwalton Fen. They are both part of the Great Fen Project, an exciting habitat restoration project, involving several partners, including the local Wildlife Trust, Natural England and the Environment Agency. At Woodwalton, the more frequent extreme weather events that occur because of climate change result in flooding that spills into the reserve. In the past, this was a good example of NFM as the reserve provided a buffer for excess floodwater. However, Great Fen Monitoring and Research Officer Henry Stanier says: “Floodwater increasingly contains high levels of nutrients and silt which can harm the reserve’s ecology, so a holistic, future-proof strategy for the area is necessary.”

Applauding the farsightedness of William Wells, Henry says: “As a conservationist I am often called in to set up monitoring after ecological recovery has begun, rather than during or even before harm has taken place. At the Wildlife Trust, we are therefore following the example provided by Wells, and have a network of monitoring wells in place so that we can monitor the effects of any future changes in land management.

“For example, we are setting up a grant funded project to identify the most appropriate crops for this area; now and in the future, and we are working with OTT to develop a monitoring strategy that will integrate well monitoring with the measurement of nutrients such as phosphate and nitrate in surface waters.”

Summary
Monitoring provides an opportunity to measure the effects of initiatives and mitigation measures. It also enables the identification of trends so that timely measures can be undertaken before challenges become problems, and problems become catastrophes.

Monitoring is an essential component of NFM, helping to define appropriate measures, measure their success, keep stakeholders informed, identify mistakes, raise alarms when necessary, inform adaptive management and help guide future research.

#Environment @OTTHydromet @EnvAgency @friends_earth


It all began with the War of the Currents…

24/01/2020

Today, people greatly appreciate having electrical energy available at the flip of a switch, seemingly at any time and for any occasion. But where does electricity actually come from? The answer most people would give you is: “from the wall socket, of course”. So does this automatically settle the question of security of supply? More on this later.

If we compare the history of electric current with the 75 years of the history of Camille Bauer Metrawatt AG, it is easy to see how they were interlinked at certain times in the course of their development. Why is that?

It all began with the War of the Currents – an economic dispute about a technical standard

It was around 1890 when the so-called War of the Currents started in the USA. At that time, the question was whether the direct current favoured by Thomas Alva Edison (1847-1931) or the alternating current promoted by Nikola Tesla (1856-1943) and financially supported by George Westinghouse (1846-1914) was the more suitable technology for supplying the United States of America with electrical energy over large areas and constructing power grids. Because of Westinghouse’s market dominance at that time compared to Edison General Electric (called General Electric from 1890 on), it soon became clear that the alternating current championed by Nikola Tesla was rapidly gaining the upper hand, not least because its approximately 25% lower transmission losses weighed unquestionably in its favour. Soon afterwards came the breakthrough for alternating voltage as the means of transmitting electrical energy. Initially, the main target application was electric lighting, which was spurred on by Edison’s invention of the incandescent lamp. The reasons for this were logical: Westinghouse was initially a lighting manufacturing company and wanted to secure as great a market share as possible.
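The underlying physics helps explain why the transformer-friendly AC system prevailed: for a given delivered power, resistive line losses fall with the square of the transmission voltage, and at the time only AC could be stepped up to high voltages economically. A minimal statement of that relationship (the 25% figure above is the article's historical comparison, not derived from this):

```latex
P_{\text{loss}} = I^{2} R = \left(\frac{P}{V}\right)^{2} R
\qquad\Longrightarrow\qquad
\frac{P_{\text{loss},2}}{P_{\text{loss},1}} = \left(\frac{V_{1}}{V_{2}}\right)^{2}
```

Here P is the delivered power, V the transmission voltage and R the line resistance: doubling the transmission voltage cuts resistive losses to a quarter.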

As developments continued, it is no surprise that by 1891, in Germany for example, the first long-distance transmission of electrical energy was put into operation, over a distance of more than 170 km from Lauffen am Neckar to Frankfurt am Main. It was a technological breakthrough using three-phase current technology. However, this was by no means the end of the story for direct current. Not least because of digitalisation, electromobility, decentralised energy supplies and so on, DC voltage has experienced a full-blown renaissance and is now treated almost as a brand-new topic.

The Camille Bauer story.
The foundation of the Camille Bauer company dates back to 1900, immediately after the War of the Currents just described, at a time when electricity was rapidly gaining in importance. At the turn of the century, the Camille Bauer company, named after its founder Camille Bauer-Judlin, began importing measuring instruments for the trendy new phenomenon called “electricity” into Switzerland for sale to the local market. Some years later, in 1906, Dr. Siegfried Guggenheimer (1875-1938), formerly a research scientist for Wilhelm Conrad Röntgen (1845-1923), who in 1901 had become the first winner of the Nobel Prize in Physics, founded what was effectively a start-up company in Nuremberg, Germany, trading under his own name. The company was engaged in the production and sale of electrical measuring instruments. However, because Dr. Guggenheimer was of Jewish descent, pressure from the Nazis forced him to rename the company in 1933, creating Metrawatt AG.

Four technological segments.

In 1919, a man by the name of Paul Gossen entered the picture. He was so dissatisfied with his employment with Dr. Guggenheimer that he founded his own company in Erlangen, near Nuremberg, and for decades the two rivals were continuously in fierce competition with one another. In 1944, towards the end of the Second World War, Camille Bauer could see that its importing business had virtually come to a standstill. All the factories of its suppliers, which were mainly in Germany (for example Hartmann & Braun, Voigt & Haeffner, Lahmeyer, etc.), had been converted to supplying materials for the war. At this point, a decision had to be made quickly. Camille Bauer’s original trading company, located in Basel (CH), undertook a courageous transformation: in order to survive, it turned itself into a manufacturing company. As a first step, the recently formed manufacturing company Matter, Patocchi & Co. AG in Wohlen (CH) was taken over, in order to get the business up and running quickly with the necessary operating resources at its disposal. Thus the Swiss manufacturing base in Wohlen, in the canton of Aargau, was born.

The story does not end there. In 1979, Camille Bauer was taken over by Röchling, a family-owned company in Mannheim, Germany. At that time, Röchling wanted to quit the iron and steel business and enter the field of I&C technology. Later, in 1993, Gossen in Erlangen and Metrawatt in Nuremberg were reunited in a single company, after Röchling became owner of the Gossen holding company through the acquisition of the Bergmann Group from Siemens in 1989 and acquired Metrawatt from ABB in 1992. At the same time, Camille Bauer’s German sales operation in Frankfurt-Dreieich also became part of the company. Today the companies operate globally and successfully under the umbrella brand of GMC-I (Gossen Metrawatt Camille-Bauer-Instruments).

A new era.
The physics of electric current have not changed over the course of time. However, business conditions have changed drastically, especially over the last 5-10 years. Catch phrases such as the free electricity market, collective self-consumption, renewable energy sources, PV, wind power, climate targets, reduction of CO2 emissions, e-mobility, battery storage, Tesla, smart meters, digitalisation, cyber security, network quality, etc. are all areas of interest for both people and companies. And last but not least, with today’s protest demonstrations, climate change has become a political issue. We will have to see what results from this. At the very least, the catch phrases mentioned above are perfect for developing scenarios for electricity supply security. And it really is the case that the traditional electricity infrastructure, which is often as old as Camille Bauer Metrawatt itself, was not designed for the new types of energy behaviour, whether on the consumer side or the decentralised feed-in side. As a result, it is ever more important to have increasing numbers of intelligent systems which work from basic data obtained from precise measurements in order to avoid outages, blackouts and the resulting damage.

The overall diversity of these new clusters of topics has prompted Camille Bauer Metrawatt AG to once more face the challenges with courage and above all to do so in an innovative and productive way. In this spirit, Camille Bauer Metrawatt AG develops, produces and distributes its product range globally in 4 technological segments.

These are:
(1) Measurement & Display,
(2) Power Quality,
(3) Control & Monitoring,
(4) Software, Systems and Solutions.

Through its expert staff, modern tools and external partners, Camille Bauer Metrawatt is able, for example, to analyse power quality and detect power quality problems. In addition, the Camille Bauer Metrawatt Academy, founded in 2019, focuses on knowledge transfer by experienced lecturers, with the latest and most important topics as its main priority. Furthermore, we keep in very close contact with customers, authorities, associations, specialist committees, educational institutions, practice-oriented experts and the scientific community in order to continually provide the requisite solutions to the market and interested parties.

#Camille_Bauer_Metrawatt #PAuto @irishpwrprocess


The most viewed stories in 2019.

02/01/2020
  • MCAA President Teresa Sebring has certified the election of officers and directors of the Measurement, Control & Automation Association…
  • The VP869 high performance 6U OpenVPX FPGA processing board has been announced by Abaco Systems. Featuring two Xilinx® UltraScale+™ FPGAs …
  • Mr. Uwe Gräff has been appointed to the Board of New Technologies & Quality at the Harting Technology Group. He follows Dr. Frank Brode…
  • ISA demonstrates its undoubted strength again in providing stunning seminars allied with top class training built on member experience. Ne…
  • GE-IP Third Annual Executive Forum on delivering operational excellence. GE Intelligent Platforms recently hosted its third annual execut…
  • Leading monitoring and analysis solution makes improving SQL Server performance easier than ever SolutionsPT has announced a new partners…
  • The International Society of Automation (ISA) has welcomed Paul Gruhn, PE, CFSE, and ISA Life Fellow, as its 2019 Society President. Pa…
  • exida, LLC announced that it has assessed the Honeywell ISA100 Wireless™ Device Manager model WDMY version R320 and certified that it meets…
  • Anglia has continued to make big strides towards full RoHS 3* compliance over six months before the deadline for meeting the new provisions…
  • The emergence of radar has been an important advance in the level measurement field. Radar represents a cost effective, accurate solution that is immune to density and other process fluid changes….

#PAuto #TandM



What is on the list of trends for 2020?

06/12/2019
Data centre trends for 2020 from Rittal

Growing volumes of data, a secure European cloud (data control), rapid upgrades of data centres and rising energy consumption are the IT/data centre trends Rittal has identified for 2020. For example, the use of OCP (Open Compute Project) technology and heat recovery offers solutions to the challenges of the present.

According to the market researchers at IDC (International Data Corporation), humans and machines could already be generating 175 zettabytes of data by 2025. If this amount of data were stored on conventional DVDs, it would mean 23 stacks of data discs, each of them reaching up to the moon. The mean 27 percent annual rate of data growth is also placing increasing pressure on the IT infrastructure.
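To get a feel for how quickly a 27 percent annual growth rate compounds, a short illustrative calculation (the rate is the article's figure; the time horizons are arbitrary):

```python
# Compounding the quoted 27% mean annual data growth rate.
growth_rate = 0.27
for years in (1, 3, 5, 10):
    factor = (1 + growth_rate) ** years
    print(f"After {years:2d} year(s): roughly {factor:.1f}x the original data volume")
```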

Since there is hardly any company that can afford to increase its own data storage by almost a third every year, IT managers are increasingly relying on IT services from the cloud. The trend towards the cloud has long since been a feature in Germany: A survey published in the summer of 2019 by the Bitkom ICT industry association together with KPMG showed that three out of four companies are already using cloud solutions.

However, businesses using cloud solutions from third-party providers do lose some control over their corporate data. That is why, for example, the US Cloud Act (Clarifying Lawful Overseas Use of Data) allows US authorities to access data stored in the cloud even if local laws at the location where the data is stored prohibit this.

“Future success in business will be sustainable if they keep pace with full digital transformation and integration. Companies will use their data more and more to provide added value – increasingly in real time – for example in the production environment,” says Dr Karl-Ulrich Köhler, CEO of Rittal International. “Retaining control over data is becoming a critical success factor for international competitiveness,” he adds.

Trend #1: Data control
The self-determined handling of data is thus becoming a key competitive factor for companies. This applies to every industry in which data security is a top priority and where the analysis of this data is decisive for business success. Examples are the healthcare, mobility, banking or manufacturing industries. Companies are now faced with the questions of how to process their data securely and efficiently, and whether to modernise their own data centre, invest in edge infrastructures or use the cloud.

The major European “Gaia-X” digital project, an initiative of the German Federal Ministry for Economics and Energy (BMWi), is set to start in 2020. The aim is to develop a European cloud for the secure digitalization and networking of industry that will also form the basis for using new artificial intelligence (AI) applications. The Fraunhofer Gesellschaft has drawn up the “International Data Spaces” initiative in this context. This virtual data room allows companies to exchange data securely. The compatibility of their own solutions with established (cloud) platforms (interoperability) is also provided.

This means that geographically widespread, smaller data centres with open cloud stacks might be able to create a new class of industrial applications that perform initial data analysis at the point where the data is created and use the cloud for downstream analysis. One solution in this context is ONCITE. This turnkey (plug-and-produce) edge cloud data centre stores and processes data directly where it arises, enabling companies to retain control over their data when networking along the entire supply chain.

Trend #2: Standardisation in data centres with OCP
The rapid upgrade of existing data centres is becoming increasingly important for companies as the volume of data needing to be processed continues to grow. Essential requirements for this growth are standardised technology, cost-efficient operation and a high level of infrastructure scalability. OCP (Open Compute Project) technology, with its central direct current distribution in the IT rack, is becoming an interesting alternative for more and more CIOs, because DC components open up new potential for cost optimisation. For instance, all the IT components can be powered centrally with n+1 power supplies per rack. In this way, efficient cooling is achieved, since fewer power packs are present. At the same time, the high degree of standardisation of OCP components simplifies both maintenance and spare parts management. The mean efficiency gain is around five percent of the total current.

Rittal expects that OCP will establish itself further in the data centre as an integrated system platform in 2020. New OCP products for rack cooling, power supply or monitoring will enable rapid expansion with DC components. Furthermore, new products will support the conventional concept of a central emergency power supply, where the power supply is safeguarded by a central UPS. As a result, it will no longer be necessary to protect every single OCP rack with a UPS based on lithium-ion batteries. The advantage: the fire load in the OCP data centre is reduced considerably.

Trend #3: Heat recovery and direct CPU cooling
Data centres release huge amounts of energy into the environment in the form of waste heat. As the power density in the data centre grows, so too does the amount of heat, which can then potentially be used for other purposes. So far, however, the use of waste heat has proven too expensive, because, for example, consumers of heat are rarely found in the direct vicinity of the site. In addition, the waste heat generated by air-based IT cooling systems, at around 40 degrees Celsius, is clearly too low in temperature to be used economically.

In the area of high-performance computing (HPC) in particular, IT racks generate high thermal loads, often in excess of 50 kW. For HPC, direct processor cooling with water is significantly more efficient than air cooling, making return temperatures of 60 to 65 degrees Celsius available. At these temperatures it is possible, for instance, to heat domestic hot water, drive heat pumps or feed heat into a district heating network. However, CIOs should be aware that only about 80 percent of the waste heat can be drawn from an IT rack, even with direct CPU water cooling; IT air cooling is still needed for the remaining 20 percent.
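Applying that 80/20 split to a 50 kW rack gives a sense of the quantities involved (illustrative arithmetic based on the figures quoted above):

```python
# Waste-heat split for a high-density HPC rack, using the figures quoted above.
rack_load_kw = 50.0          # thermal load of an HPC rack (often in excess of 50 kW)
recoverable_fraction = 0.80  # share extractable via direct CPU water cooling

recoverable_kw = rack_load_kw * recoverable_fraction    # heat available at 60-65 °C
residual_air_cooled_kw = rack_load_kw - recoverable_kw  # still needs conventional IT cooling

print(f"Recoverable heat:         {recoverable_kw:.0f} kW")
print(f"Residual air-cooled load: {residual_air_cooled_kw:.0f} kW")
```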

At the German Government’s 2019 Digital Summit, the topic of heat recovery was discussed in the working group concerned, which identified a high need for action. For this reason, Rittal assumes that by 2020, significantly more CIOs will be involved in the issue of how the previously unused waste heat from the data centre can be used economically.

Trend #4: Integration of multi-cloud environments
Businesses need to be assured that they can run their cloud applications on commonly used platforms and in any country. This calls for a multi-cloud strategy. From management’s point of view, this is a strategic decision based on the knowledge that its own organisation will develop into a fully digitised business.

For example, an excellent user experience is guaranteed by minimising delays with the appropriate availability zones on site. This means that companies choose one or more availability zones worldwide for their services, depending on their business requirements. Strict data protection requirements are met, for example, by a specialised local provider in the target market concerned. A vendor-open multi-cloud strategy allows exactly that: combining the functional density and scalability of hyperscalers with the data security of local and specialised providers such as Innovo Cloud. Available at the push of a button, on a dashboard, with a single contact person and invoice, and in the very second the business decision is made: this is what makes multi-cloud strategies one of the megatrends of the coming years. The economy will take further steps towards digital transformation and will accelerate its own continuous integration (CI) and continuous delivery (CD) pipelines with cloud-native technologies, applications designed and developed for the cloud computing architecture. Automating the integration and delivery processes will then enable the rapid, reliable and repeatable deployment of software.

#PAuto @Rittal @EURACTIV @PresseBox