Whither Augmented, Mixed and Virtual Reality?

23/03/2020

XR is a term which has become more prominent in the last few years. It encapsulates virtual, augmented and mixed reality topics. The definitions of each have become blurred over the past decade, with companies using their own definitions to describe their products. The new IDTechEx report, “Augmented, Mixed and Virtual Reality 2020-2030”, distils this range of terms and products, compares the technologies used in them, and produces a market forecast for the next decade.

The report discusses 83 different companies and 175 products in VR (virtual reality), AR (augmented reality) and MR (mixed reality) markets. This article specifically discusses the findings on the virtual reality market.

Virtual reality (VR) involves creating a simulated environment which a user can perceive as real. This is achieved by stimulating the various senses with appropriate signals – most commonly visual (via displays and optics) and auditory (via headphones or speakers), but increasingly also haptic (touch) sensations. Generating a realistic virtual environment requires producing appropriate stimuli, together with systems that direct how those stimuli should change, whether automatically or in response to user interaction. As such, VR relies on a variety of components and systems, including displays, optics, sensors, communication and processing, delivered via both hardware and associated software.

There are three main groups of VR headset – PC VR, Standalone VR and Smartphone VR. PC VR has a user interface and display worn on the body, but the computing and power are offloaded to an external computer; this is where most of the commercial hardware revenue is made today. Standalone VR is a dedicated device (no tethering) with all required computing and components on board. Finally, smartphone/mobile VR uses the smartphone's processor, display and sensors to power the VR experience, with only a very cheap accessory needed to convert the phone for VR. The report discusses the revenue split for these three sectors in full, and an example image is shown in the figure on the right.

The report discusses the likelihood of a shift in the devices used by consumers, for example from a PC VR to a standalone VR headset, which would provide greater freedom of movement and accessibility for different use cases. One example of a standalone VR product is the Oculus Quest, released in 2019. It was one of the first standalone devices aimed at gaming, with all the heat management and processing systems on the headset itself. Oculus is one of the big players in the VR market and has a range of products, some of which are shown in the table and images below.

These headsets provide a range of experiences for the user, at different price points. Founded in 2012 and bought by Facebook for $2.3bn in 2014, Oculus has continued to grow and produce VR products for a range of markets. Details of the growth of the VR market are included in the report for a range of companies and their different use cases. The overall market is expected to grow, as shown in the plot below.

The full image is available in the report

VR, AR & MR, as with nearly any technology area, must build on what has come before. The existing wave of interest, investment and progress in the space has been built on top of technology developed in other areas, most notably the smartphone. Many components in VR, AR & MR headsets – from the displays, to the sensors (IMUs, 3D imaging, cameras and more), to the batteries and power management – build directly on components which benefited from the heavy investment around the smartphone, and that investment now targets the future potential of XR headsets. This report provides a complete overview of the companies, technologies and products in augmented, virtual and mixed reality, allowing the reader to gain a deeper understanding of this exciting technology.

#PAuto @IDTechEx @IDTechExShow


It all began with the War of the Currents…

24/01/2020

Today, people greatly appreciate having electrical energy available at the flip of a switch, seemingly at any time and for any occasion. But where does electricity actually come from? The answer most people would give you is: “from the wall socket, of course”. So does this automatically settle the question of security of supply? More on this later.

If we compare the history of electric current with the 75 years of the history of Camille Bauer Metrawatt AG, it is easy to see how they were interlinked at certain times in the course of their development. Why is that?

It all began with the War of the Currents – an economic dispute about a technical standard

It was around 1890 when the so-called War of the Currents started in the USA. At that time, the question was whether the direct current favoured by Thomas Alva Edison (1847-1931) or the alternating current promoted by Nikola Tesla (1856-1943), and financially supported by George Westinghouse (1846-1914), was the more suitable technology for supplying the United States of America with electrical energy over large areas and constructing power grids. Because of Westinghouse's market dominance at that time compared to Edison General Electric (called General Electric from 1892 on), it soon became clear that the alternating voltage promoted by Nikola Tesla was rapidly gaining the upper hand, not least because its approximately 25% lower transmission losses weighed unquestionably in its favour. Soon afterward came the breakthrough for alternating voltage as the means of transmitting electrical energy. Initially, the main target application was electric lighting, which had been spurred on by the invention of the incandescent lamp by Edison. The reasons for this were logical: Westinghouse was initially a lighting manufacturing company and wanted to secure as great a market share as possible.

As developments continued, it is no surprise that already by 1891, in Germany for example, the first long-distance transmission of electrical energy was put into operation, over a distance of more than 170 km from Lauffen am Neckar to Frankfurt am Main. It was a technological breakthrough using three-phase current technology. However, this was by no means the end of the story for direct current. Not least because of digitalization, electromobility, decentralized energy supplies, etc., DC voltage has experienced a full-blown renaissance and is now treated almost as a brand-new topic.

The Camille Bauer story.
The foundation of the Camille Bauer company dates back to 1900, immediately after the War of the Currents just described, at a time when electricity was rapidly gaining in importance. At the turn of the century, the Camille Bauer company, named after its founder Camille Bauer-Judlin, began importing measuring instruments for the trendy new phenomenon called “electricity” into Switzerland for sale to the local market. Some years later, in 1906, Dr. Siegfried Guggenheimer (1875-1938), formerly a research scientist for Wilhelm Conrad Röntgen (1845-1923), the first winner of the Nobel Prize in Physics in 1901, founded what was a start-up company in Nuremberg, Germany, trading under his own name. The company was engaged in the production and sale of electrical measuring instruments. However, because Dr. Guggenheimer was of Jewish descent, pressure from the Nazis forced him to rename the company in 1933, creating Metrawatt AG.

Four technological segments.

In 1919, a man by the name of Paul Gossen entered the picture. He was so dissatisfied with his employment with Dr. Guggenheimer that he founded his own company in Erlangen, near Nuremberg, and for decades the two rivals were continuously in fierce competition with one another. In 1944, towards the end of the Second World War, Camille Bauer could see that its importing business had virtually come to a standstill. All the factories of its suppliers, which were mainly in Germany (for example Hartmann & Braun, Voigt & Haeffner, Lahmeyer, etc.), had been converted to supplying materials for the war. At this point, a decision had to be made quickly. Camille Bauer's original trading company, located in Basel (CH), undertook a courageous transformation: in order to survive, it turned itself into a manufacturing company. In a first step, the recently formed manufacturing company Matter, Patocchi & Co. AG in Wohlen (CH) was taken over, in order to get the business up and running quickly with the necessary operating resources at its disposal. Thus the Swiss manufacturing base in Wohlen, in the canton of Aargau, was born.

The story does not end there. In 1979, Camille Bauer was taken over by Röchling, a family-owned company in Mannheim, Germany. At that time, Röchling wanted to quit the iron and steel business and enter the field of I&C technology. Later, in 1993, Gossen in Erlangen and Metrawatt in Nuremberg were reunited in a single company, after Röchling became owner of the Gossen holding company as a result of the acquisition of the Bergmann Group from Siemens in 1989, and Metrawatt was acquired from ABB in 1992. At the same time, Camille Bauer's German sales operation in Frankfurt-Dreieich also became a part of the company. Today the companies operate globally and successfully under the umbrella brand of GMC-I (Gossen Metrawatt Camille-Bauer-Instruments).

A new era.
The physics of electric current have not changed over the course of time. However, business conditions have changed drastically, especially over the last 5-10 years. Catch phrases such as the free electricity market, collective self-consumption, renewable energy sources, PV, wind power, climate targets, reduction of CO2 emissions, e-mobility, battery storage, Tesla, smart meters, digitalization, cyber security, network quality, etc. are all areas of interest for both people and companies. And last but not least, with today's protest demonstrations, climate change has become a political issue. We will have to see what results from this. At the very least, the catch phrases mentioned above are perfect for developing scenarios for security of electricity supply. And it really is the case that the traditional electricity infrastructure, which is often as old as Camille Bauer Metrawatt itself, was not designed for the new types of energy behaviour, whether on the consumer side or the decentralised feed-in side. As a result, it is ever more important to have intelligent systems which work from basic data obtained from precise measurements in order to avoid outages, blackouts and resulting damage.

The overall diversity of these new clusters of topics has prompted Camille Bauer Metrawatt AG to once more face the challenges with courage and above all to do so in an innovative and productive way. In this spirit, Camille Bauer Metrawatt AG develops, produces and distributes its product range globally in 4 technological segments.

These are:
(1) Measurement & Display,
(2) Power Quality,
(3) Control & Monitoring,
(4) Software, Systems and Solutions.

Through its expert staff, modern tools and external partners, Camille Bauer Metrawatt is able, for example, to analyse power quality and detect power quality problems. In addition, the Camille Bauer Metrawatt Academy, founded in 2019, focuses on knowledge transfer by experienced lecturers, with the latest and most important topics as its main priority. Furthermore, we keep in very close contact with customers, authorities, associations, specialist committees, educational institutions, practice-oriented experts and the scientific community in order to continually provide the requisite solutions to the market and interested parties.

#Camille_Bauer_Metrawatt #PAuto @irishpwrprocess


The most viewed Stories in 2019.

02/01/2020
  • MCAA President Teresa Sebring has certified the election of officers and directors of the Measurement, Control & Automation Association…
  • The VP869 high performance 6U OpenVPX FPGA processing board has been announced by Abaco Systems. Featuring two Xilinx® UltraScale+™ FPGAs …
  • Mr. Uwe Gräff has been appointed to the Board of New Technologies & Quality at the Harting Technology Group. He follows Dr. Frank Brode…
  • ISA demonstrates its undoubted strength again in providing stunning seminars allied with top class training built on member experience. Ne…
  • GE-IP Third Annual Executive Forum on delivering operational excellence. GE Intelligent Platforms recently hosted its third annual execut…
  • Leading monitoring and analysis solution makes improving SQL Server performance easier than ever. SolutionsPT has announced a new partners…
  • The International Society of Automation (ISA) has welcomed Paul Gruhn, PE, CFSE, and ISA Life Fellow, as its 2019 Society President. Pa…
  • exida, LLC announced that it has assessed the Honeywell ISA100 Wireless™ Device Manager model WDMY version R320 and certified that it meets…
  • Anglia has continued to make big strides towards full RoHS 3* compliance over six months before the deadline for meeting the new provisions…
  • The emergence of radar has been an important advance in the level measurement field. Radar represents a cost effective, accurate solution that is immune to density and other process fluid changes….

#PAuto #TandM


Most viewed stories in 2018

Why monitor dust?

17/04/2018
Josh Thomas of Ashtead Technology discusses the reasons for monitoring dust in the workplace.

Almost any place of employment can present a potential threat to health and safety from airborne particulates and aerosols. It is important to note, however, that dust hazards are not necessarily visible to the human eye and that the finest particles can represent the greatest threat because of their ability to travel deepest into the lungs. Effective monitoring is therefore key to the implementation of an effective risk management strategy.

There are two major reasons for monitoring dust in the workplace: to enable air quality management, and for regulatory compliance. The immediate effects of dust can be irritation to eyes, headaches, fatigue, coughing and sneezing. As such, poor indoor air quality can lower employee performance and cause increased absenteeism through sickness. In addition, particulates are known to create long-term deleterious effects, contributing to serious illnesses. In combination with outdoor exposure (to pollution from vehicles, for example), the Government has estimated that 29,000 premature deaths occur in the UK every year as a result of particle pollution. This means that, particularly in urban areas, natural ventilation may not necessarily improve indoor air quality.

DustTrak

Employers are responsible for ensuring that staff and visitors are not exposed to poor air quality in the workplace, so it is necessary to conduct monitoring. Accurate and effective monitoring data can be used to check exposure levels and to help identify safe working practices.

Monitoring also helps to demonstrate compliance with relevant regulations. COSHH is the law that requires employers to control substances that are hazardous to health. According to the Health & Safety Executive (HSE), employers can prevent or reduce workers’ exposure to hazardous substances by finding out what the health hazards are; by deciding how to prevent harm to health; by providing effective control measures; by providing information and training; by providing monitoring and health surveillance, and by planning for emergencies.

In order to evaluate workplace safety, monitoring data is compared with Workplace Exposure Limits (WELs), which prescribe the maximum exposure level to a hazardous substance over a set period of time. Failure to comply with COSHH and WELs can result in financial penalties, prosecutions and civil claims.

Indoor air quality is affected by both internal and external factors. Air pollution may arise from external sources such as neighbouring factories, building and development activities, or from vehicles – especially those with diesel engines. Internally, air quality is affected by working practices and business processes. For example, dust may arise from raw materials such as powders, or it may be produced by processes that generate particulates; including dust, mist, aerosols and smoke. In all cases, internal and external, it is important to identify both the source and the seriousness of the problem, so that appropriate and effective mitigation measures can be implemented. These might include, for example, ventilation, process dust prevention, the management of shift patterns, personal protection equipment (PPE) and alarm systems.

Regulatory requirements to monitor
Under the British Workplace (Health, Safety and Welfare) Regulations 1992, employers have a legal duty to ensure, so far as is reasonably practicable, the health, safety and welfare of employees. Furthermore, the Management of Health and Safety at Work Regulations 1999 (GB) require employers to assess and control risks to protect their employees. A key element of this is the requirement to comply with the COSHH Regulations. The HSE says that exposure measurement is required:

  • For COSHH assessment, to help select the right controls
  • Where there is a serious risk to health from inhalation
  • To check that exposure limits are not exceeded
  • To check the performance of exposure controls
  • To help select the right respiratory protection equipment
  • To check exposure following a change in a process
  • To show any need for health surveillance; or
  • When an inspector issues an ‘Improvement Notice’ requiring monitoring

The COSHH Regulations cover dust, mist, vapour, fumes and chemicals, but they do not cover lead or asbestos. Specific requirements exist for certain industries such as construction. Generally, WELs relate to particulate diameter because the health effects of particulates are heavily influenced by their size.

Inhalable dust is that which enters the nose or mouth during breathing and is available for deposition in the respiratory tract. It includes particles with a width between 2.5 and 10 microns (PM2.5 – PM10), and the WEL for this fraction is 10 mg/m3 as an 8-hour Time Weighted Average (TWA).

Respirable dust is the fraction that penetrates deep into the gas exchange region of the lungs. It includes particles with a width between 1 and 2.5 microns (PM1 – PM2.5), and the WEL for this fraction is 4 mg/m3 as an 8-hour TWA. Lower specific WELs exist for particulates that present a greater threat to health. For example, silica dusts have a WEL of just 0.1 mg/m3 respirable dust as an 8-hour TWA.
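To make the arithmetic concrete, the short sketch below computes an 8-hour TWA from a set of timed concentration readings and compares it against the WELs quoted above. The shift pattern and readings are invented for illustration; the formula (the sum of concentration × duration terms, divided by 8 hours) is the standard TWA calculation used with WELs.

```python
# Sketch: 8-hour time-weighted average (TWA) exposure versus a WEL.
# TWA = sum(C_i * t_i) / 8, with C_i in mg/m3 measured over t_i hours.

def twa_8hr(samples):
    """samples: list of (concentration_mg_m3, duration_hours) pairs."""
    return sum(c * t for c, t in samples) / 8.0

# Invented shift: 4 h at 3.2 mg/m3, 2 h at 6.0 mg/m3, 2 h unexposed.
shift = [(3.2, 4), (6.0, 2), (0.0, 2)]
exposure = twa_8hr(shift)  # -> 3.1 mg/m3

WEL_INHALABLE = 10.0   # mg/m3, 8-hour TWA (figure quoted above)
WEL_RESPIRABLE = 4.0   # mg/m3, 8-hour TWA (figure quoted above)

print(f"8-hour TWA: {exposure:.2f} mg/m3")
print("Within inhalable WEL" if exposure <= WEL_INHALABLE else "Exceeds inhalable WEL")
```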

The costs of non-compliance
In addition to the enormous numbers of premature deaths that result from exposure to outdoor air pollution, there are also numerous well-documented instances demonstrating the harm caused by exposure to indoor pollution from dust, smoke, aerosols and vapour. For example, a 46-year-old cook developed breathing problems after working with flour in a school kitchen with poor ventilation. Her breathing problems became so severe that she could hardly walk and had to sleep sitting up. She became severely asthmatic and had to retire early on health grounds. With the support of her Union she made a compensation claim on the basis that decent working conditions were not provided, and the council admitted that it had not taken sufficient action despite repeated complaints. Consequently, the courts awarded the cook £200,000 (€230k) in damages.

In another example, between 1995 and 2004, a solderer was exposed to rosin based solder fumes and suffered health deterioration and breathing problems including asthma. An investigation conducted by the HSE found that the company did not have adequate control measures in place and failed to install fume extraction equipment. Furthermore, the company did not employ rosin-free solder until December 2003, despite an assessment having identified the need in 1999. The company was subsequently fined £100,000 (€116k) with £30,000 (€35k) costs, a punishment which attracted both local and national media attention.

Monitoring dust
A wide variety of methods exist for the measurement of dust, and the choice of equipment is dictated by the application. For example, it is obviously important to employ a technology that is able to measure the particulates that will be present. In addition, it will be necessary to determine whether monitoring should be continuous, at a single point, or whether portable instruments are necessary to check multiple locations. Monitoring might be conducted in a work space, or personal sampling might be undertaken in order to assess the exposure of an individual over an entire shift.

Personal Sampling Pumps represent the preferred method for workplace exposure monitoring where it is necessary to demonstrate regulatory compliance or where legal dispute is a possibility. An HSE document (MDHS 14/4) provides workplace exposure monitoring guidance for collecting respirable, thoracic and inhalable aerosol fractions. The samples collected by this process are analysed in a laboratory, which means that chemical analysis is also possible. However, the method involves a delay for laboratory analysis and incurs extra cost.
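As an aside on the arithmetic behind pump-based sampling, the sketch below converts a laboratory-weighed filter mass into an average airborne concentration. The flow rate, run time and filter masses are invented example values; the calculation (collected mass divided by sampled air volume) is the generic gravimetric formula, not a procedure taken from MDHS 14/4 itself.

```python
# Sketch: gravimetric concentration from a personal sampling pump run.
# concentration (mg/m3) = collected mass (mg) / sampled air volume (m3)

flow_l_per_min = 2.2      # pump flow rate in litres/minute (example value)
run_time_min = 8 * 60     # full-shift sample duration in minutes
filter_pre_mg = 12.40     # filter mass before sampling (example value)
filter_post_mg = 15.95    # filter mass after sampling (example value)

volume_m3 = flow_l_per_min * run_time_min / 1000.0  # litres -> cubic metres
concentration = (filter_post_mg - filter_pre_mg) / volume_m3

print(f"Sampled volume: {volume_m3:.3f} m3")                 # 1.056 m3
print(f"Average concentration: {concentration:.2f} mg/m3")   # ~3.36 mg/m3
```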

In response to the wide variety of applications and monitoring requirements, Ashtead Technology stocks a comprehensive range of monitors for both sale and rental, providing customers with complete financial and technical flexibility. As a TSI Gold Partner, Ashtead Technology provides a comprehensive range of maintenance and calibration services; helping customers to ensure that their monitoring equipment remains in optimal condition. Ashtead’s fleet of rental equipment includes large numbers of the latest TSI instruments, supported by the highest levels of service and technical assistance. Employing advanced light-scattering laser photometers, the TSI products are supplied with a calibration certificate and provide real-time, direct-reading aerosol monitoring and analysis of different particulate fractions in workplace, cleanroom, HVAC, fugitive emissions and environmental monitoring applications.

The TSI range of dust monitors is continually being developed to bring new levels of functionality to the market. For example, the new lightweight AM520 Personal Dust Monitor is able to measure and log PM10, Respirable (PM4), PM5 (China Respirable), PM2.5, PM1 or 0.8μm Diesel Particulate Matter (DPM), providing real-time audible and visual alarms, and running from a rechargeable battery for up to 20 hours. For outdoor applications, the MCERTS approved Environmental DustTrak is web-enabled, providing a quick and easy dust monitoring solution for applications such as building and development projects.

@ashteadtech #PAuto @TSIIncorporated

Robotics: A new revenue source for the taxman?

16/04/2018
Robot tax? A tax on robotics is as absurd an idea as a tax on pencils. As Britain’s political parties discuss a potential tax on automation and robotics, Nigel Smith, managing director of Toshiba Machine partner, TM Robotics, explains why slowing down the machine economy would lead to a productivity disaster.

The world’s first robot tax was introduced in South Korea last year. The tax was created amid fears that a rise in automation and robotics was threatening human workers and could lead to mass unemployment in the country. But, this so-called robot tax was not actually a tax at all. Instead, the country limited tax incentives for investments in automation, lessening the existing tax breaks for automation.

Calling it a tax was simply rhetoric delivered by its opponents. Essentially, it was just a revision of existing tax laws. Regardless of its name, South Korea’s announcement sparked several debates as to whether a robot tax would be advantageous in other countries.

At the time, Bill Gates famously called for a technology levy, suggesting that a tax could balance the Government's income as jobs are lost to automation. The levy was intended to slow down the pace of change and provide money for the Government to increase job opportunities in other sectors.

Taxing robots?

Fewer workers, fewer tax contributions
While most manufacturers and those operating in the robotics sector would disagree with the idea of a tax on robots, the debate does raise questions of how we tax employment in Britain — and how technology could affect this. The obvious fear at Government level is that if we replace people with robots, we reduce national insurance contributions, lessening a Government’s ability to support its people.

As an alternative, perhaps the answer to this problem is switching to a system where, rather than tax being paid per employee through national insurance contributions (NICs), contributions are calculated based on a company's overall operating costs. Using this method, NICs could take account of the impact of all forms of advanced technology, not just robots.

That being said, we are not tax experts at TM Robotics. However, we are experts in industrial robots. We sell industrial robots to manufacturers across the globe and advise them on how robots can increase productivity, efficiency and in turn, create new jobs.

Creating, not destroying jobs
Much of the debate about the potential robot tax has focused on the threat that robots and automation pose to humans. However, we should remember that robots don't always replace a human job; often they work alongside people to reduce the risk of injury – particularly in the supply chain.

Consider this as an example. TM Robotics recently introduced a robot box-opening cell to its range of industrial equipment. This type of automation would typically be used by companies like DHL and UPS, which deliver product directly into manufacturing plants and retail warehouses, to reduce the risk of injuries from knives. In this instance, a robot tax would undermine a company's ability to deliver a safe environment for its workers.

The bottom line is that robots create jobs, they don’t take them away. This is supported by the British Government’s recent Made Smarter review on digitalisation in industry. The review concludes that over the next ten years, automation could boost British manufacturing by £455 billion (€525 billion), with a net gain of 175,000 jobs.

Robots are tools and they will create work, especially new kinds of work — taxing them would be a tax on net job creation. Instead of implementing a tax on robots, we should actually be providing tax breaks for companies investing in robotics.

@TMRobotics #PAuto #Robotics @StoneJunctionPR

Bob Lally – Piezoelectric sensing technology pioneer.

27/03/2018

Molly Bakewell Chamberlin, president, Embassy Global LLC pays touching tribute to an important instrument pioneer and innovator. She acknowledges the help of Jim Lally, retired Chairman of PCB Group in preparing this eulogy.

Bob Lally (1924-2018)

During my earliest days in the sensors industry, at PCB Piezotronics (PCB), I can still remember the excitement which accompanied publication of my first technical article. It was a primer on piezoelectric sensing technology, which ran some 15 years ago in the print edition of Sensors. About a month later, I recall receiving a package at PCB, containing both a copy of my article and a congratulatory letter. The article was covered in a sea of post-it notes, filled with new insights and explanatory diagrams. I recall marveling at the sheer kindness of anyone taking such time and interest in the work. I’d sent an immediate thank you, then received yet another encouraging response.  From that time onward, nearly each time I’d publish an article, another friendly envelope would arrive. I’d look forward to them, and the opportunities for learning and growth they’d offered.

As I'd soon come to know, those envelopes were sent by none other than PCB Founder, Bob Lally, who passed away last month at the age of 93. For me, Bob was my PCB pen pal, who along with his brother, Jim, helped me to develop a real appreciation for piezoelectric sensing technology. They made it fun. I also had the privilege of learning quite a bit about this kind, brilliantly complex and insightful person who was so helpful to me. To the sensors industry, Bob's technical contributions were legendary. What is less known about Bob, however, is his equally remarkable history, first as a decorated veteran of WW II, and later as an innovator in STEM.

After graduating from high school in 1942, Bob entered military service, as part of the United States Army which helped liberate mainland Europe during World War II. His service was recognised with two Bronze Stars for bravery. When the hostilities ended, Bob returned home, and was able to benefit from a special U.S. government program which funded the university education of military veterans and their families. This benefit allowed Bob to attend the University of Illinois at Urbana-Champaign, where he earned both Bachelor of Science and Master of Science degrees in Mechanical Engineering with a minor in Mathematics. He graduated with high honors, as University co-salutatorian, in 1950. Bob also later continued this commitment to lifelong learning via studies at both Purdue and the State University of New York at Buffalo (NY USA).

Bob's first engineering job upon graduation was as a guidance and control engineer at Bell Aircraft Corp. (Bell) in Buffalo (NY USA), a position in which he would serve for four years. He worked in test flight control systems R&D for experimental aircraft, glide bombs and guided missiles. He also supervised the inertial guidance group. It was from his work at Bell that Bob first learned about the application of piezoelectric sensing technology for the dynamic measurement of physical parameters, such as vibration, pressure, and force. That technology was first developed by Bob's colleague, Walter P. Kistler, the Swiss-born physicist who had successfully integrated piezoelectric technology into Bell's rocket guidance and positioning systems.

Original PCB Piezotronics facility in the family home of Jim Lally, ca 1967. Bob Lally, centre background, is operating a DuMont oscilloscope in the Test department.
Jim Lally, left foreground, leads the Sales department.

In 1955, Bob and some of his Bell colleagues decided to form what was the original Kistler Instrument Company. That company sought to further commercialize piezoelectric sensing technologies for an expanded array of applications and markets, beyond the aerospace umbrella. In addition to his role as co-founder, Bob remained at the original Kistler Instrument Company for 11 years, serving as VP of Marketing, while continuing his roles in engineering, production, testing, and sales. Upon learning that the company was being sold to a firm out of Washington State, Bob decided to form PCB Piezotronics. Established in 1967, PCB specialized in the development and application of integrated electronics within piezoelectric sensors for the dynamic measurement of vibration, pressure, force and acceleration. The original PCB facility had rather humble beginnings, with all sales, marketing, R&D and operations running from the basement of Jim Lally’s family home.

IR-100 Award plaque, presented to Bob Lally, 1983.

It was also in this timeframe that Bob became world-renowned for his capability to successfully integrate piezoelectric sensing technology into mechanical devices, setting a new industry standard for test and measurement. He was awarded multiple U.S. patents for these innovations, including the modally-tuned piezoelectric impact hammer, pendulum hammer calibrator, and gravimetric calibrator, all for the modal impact testing of machines and structures. The modally tuned impulse excitation hammer was further recognized with a prestigious IR-100 award, as one of the top 100 industry technical achievements of 1983.

Bob was also renowned for his successful commercialization of a two-wire accelerometer with built-in electronics. That concept was marketed by PCB as integrated circuit piezoelectric, or ICP. Bob's 1967 paper for the International Society of Automation (ISA), "Application of Integrated Circuits to Piezoelectric Transducers", was among the first formally published technical explanations of this concept. As Bob had detailed, the application of this technology made the sensors lower cost, easier to use and more compatible with industrial environments. Subsequent widespread industry adoption of these accelerometers created new markets for PCB, such as industrial machinery health monitoring, and formed a major cornerstone for the company's success. In 2016, PCB was acquired by MTS Systems Corporation; it employs more than 1,000 people worldwide, with piezoelectric sensing technologies still among its core offerings.

Beyond Bob’s many R&D accomplishments, he is known for his invaluable contributions to the establishment of industry standards and best practices, as a member of the technical standards committees of the Society of Automotive Engineers (SAE), Society for Experimental Mechanics (SEM), and Industrial Electronics Society (IES), among others. Bob also served on the ISA Recommended Practices Committee for Piezoelectric Pressure Transducers and Microphones, as well as the ASA Standards Committee for Piezoelectric Accelerometer Calibration. Many of the standards that Bob helped to develop, as part of these committees, remain relevant today.

Upon retirement, Bob remained committed to the education and training of the next generation of sensors industry professionals. He often gave tutorials and donated instrumentation for student use. Bob later continued that work as an adjunct professor at the University of Cincinnati. In the mid-2000s, he began to develop an innovative series of Science, Technology, Engineering, and Math (STEM) educational models. Each was designed to provide a greater understanding of various sensing technologies, their principles of operation, and “real life” illustrations of practical applications.

STEM sensing model, with adjustable pendulums, by Bob Lally.

Among Bob’s final works was a unique STEM model consisting of three adjustable connected pendulums. That model was used to illustrate the concept of energy flex transference and the influence of physical structural modifications on structural behavior. Bob continued his mentoring and STEM work nearly right up until his passing. He did so with unwavering dedication and enthusiasm, despite being left permanently disabled from his combat injuries.

In addition to co-founding two of the most successful sensor manufacturers in history and his many R&D accomplishments, Bob’s generosity of spirit shall remain an important part of his legacy. I, like many, remain truly grateful for the selfless and meaningful contributions of Bob Lally to my early professional development, particularly in my technical article work. It is an honour to tell his story.

• He is survived by his son, Patrick (Kathi) Lally of Orchard Park, New York; his grandson, Joshua Lally; his surviving siblings, Jim, MaryAnn (Wilson), and Patricia; and his many nieces, nephews, friends and colleagues.

• Special thanks to Jim, Kathi and Patrick Lally for their support and contributions to this article.

• All pictures used here are by kind courtesy of the Lally family.

VR means low design costs.

27/11/2017

Jonathan Wilkins, marketing director at EU Automation discusses how virtual reality (VR) can be used to improve the design engineering process.

In 1899, Wilbur and Orville Wright, the inventors of the aeroplane, put their first model to flight. They faced several problems, including insufficient lift and deviation from the intended direction. Following a trial flight in 1901, Wilbur said to Orville that man would not fly in a thousand years. Since then, good design has dispelled Wilbur's prediction.

The history of VR
With the invention of computer-aided design (CAD) in 1961, on-screen models could be explored in 3D, unlike with manual drafting. This made it easier for design engineers to visualise concepts before passing their design on for manufacturing.

From there, the technology continued to develop, until we reached cave automatic virtual environment (CAVE). This consisted of cube-like spaces with images projected onto the walls, floor and ceiling. Automotive and aerospace engineers could use CAVE to experience being inside the vehicle, without having to generate a physical prototype.

The latest advancements have introduced VR headsets, also known as head-mounted displays (HMDs) and haptic gloves. They enable users to visualise, touch and feel a virtual version of their design at a lower cost than CAVE technology would allow.

Benefitting design engineers
VR was first used in design engineering by the automotive and aerospace sectors to quickly generate product prototypes for a small cost.

Using the latest technologies, these prototypes can be visualised in real space and from different angles. Engineers can walk around and interact with them, and can even make changes to the design from inside the model. This makes it possible to gain a deeper understanding of how the product works and to improve the design before it is passed on for manufacturing.

Design engineers can also use VR to identify issues with a product and rectify them before a physical prototype is made. This saves time and money, but also avoids any potential problems that might arise for the end-customer, if the product is manufactured without a design error being rectified.

To study specific parts of a product and understand how it operates in greater detail, engineers often deconstruct prototypes. With physical models, this can be challenging and often leads to several prototypes being made. With VR, however, a model can be easily pulled apart, manipulated and returned to its original design.

The ergonomics of a product can also be analysed using VR. Decisions can then be made in the early stages of product development to ensure the final product is of the best possible standard.

Furthermore, engineers can use VR to determine whether it will be feasible and affordable to manufacture a product, and to plan the manufacturing protocol. This streamlines the product development process and reduces the waste of materials and time that often results from failed manufacturing attempts.

Had VR been available in 1899, the Wright brothers would not have faced so many problems designing the world’s first aeroplane and the outcome would have been achieved much more quickly. Just imagine the designs that VR could help make a reality in the future.


Disinfection robot with robust wireless access.

31/10/2017

STERISAFE-Pro is a disinfection robot from the Danish company INFUSER. It disinfects surfaces in any given room – for example patient rooms, operating theatres or hotel rooms – removing up to 99.9999% of pathogens. The robot fills the designated room with an ozone-based biocide agent which kills unwanted bacteria, viruses and fungi, while also purifying the air of fine particulate matter. STERISAFE-Pro is controlled from outside the room using wireless technology from HMS Industrial Networks.

The unit produces ozone (O3) using the oxygen (O2) already present in the room. All that is needed is electricity and water. By diffusing ozone and a fine mist of water, it is possible to expose all surfaces in a room. The ozone oxidizes the membrane or shell of bacteria, viruses and fungi, leading to total deactivation of these micro-organisms.

The ozone-saturated atmosphere in the room is sustained for a defined period of time, during which the pathogenic micro-organisms are killed on surfaces and in the air. Ozone naturally turns back to oxygen after having reacted with pathogens and other pollutants, leaving no chemical residue.
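As a rough illustration of what a figure like 99.9999% means in disinfection terms, the sketch below converts before-and-after pathogen counts into a log-reduction value. The counts are invented example numbers, not STERISAFE test data; a 99.9999% removal corresponds to a six-log reduction.

```python
import math

# Sketch: log10 reduction from before/after counts (invented values).
before = 2_000_000   # CFU per sample before the disinfection cycle
after = 2            # CFU per sample after the cycle

log_reduction = math.log10(before / after)     # -> 6.0, i.e. "six-log"
percent_removed = (1 - after / before) * 100   # -> 99.9999%

print(f"Log reduction: {log_reduction:.1f}")
print(f"Percent removed: {percent_removed:.4f}%")
```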

Robust wireless access needed
Although ozone is a naturally occurring gas, it is harmful at high concentration levels, so STERISAFE-Pro requires the operator to remain outside the sealed room while the robot runs its cycle. The operator uses a tablet which is connected wirelessly to the PLC inside the robot. INFUSER has created an app which the operator uses to control the robot. The app interfaces with the built-in webserver in the PLC.
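Since the control path is essentially a tablet app talking over WLAN to the webserver embedded in the PLC, it can be pictured as ordinary HTTP traffic. The sketch below shows that pattern in minimal form; the address, endpoint paths and JSON fields are hypothetical placeholders, not INFUSER's actual interface.

```python
import requests  # third-party HTTP client (pip install requests)

PLC_BASE = "http://192.168.1.50"  # hypothetical address of the PLC webserver

def get_status():
    """Poll a hypothetical status endpoint on the PLC's built-in webserver."""
    resp = requests.get(f"{PLC_BASE}/status", timeout=2.0)
    resp.raise_for_status()
    return resp.json()  # e.g. {"cycle": "running", "o3_ppm": 42.0}

def start_cycle(duration_min):
    """Request a disinfection cycle (hypothetical endpoint and payload)."""
    resp = requests.post(f"{PLC_BASE}/cycle/start",
                         json={"duration_min": duration_min}, timeout=2.0)
    resp.raise_for_status()

if __name__ == "__main__":
    start_cycle(90)      # operator stays outside the sealed room
    print(get_status())
```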

OK, so that sounds easy enough, but accessing a PLC inside a hermetically sealed, stainless steel machine that performs surface disinfection demanded a wireless solution with high performance.

Thomas Clapper

“When we first started developing STERISAFE-Pro, we used a regular commercial access point, but we soon realized that we needed something more robust and advanced,” says Thomas Clapper, who is responsible for production at INFUSER.

“We needed an access point that was omni-radiant and also 100% sealed. This is when we came across the Anybus Wireless Bolt from HMS Industrial Networks.”

The Anybus Wireless Bolt™ is a wireless access point for on-machine mounting. It can communicate via WLAN or Bluetooth up to 100 meters and is built for harsh industrial conditions, in terms of both the physical housing and the wireless communication.

It was a perfect fit for STERISAFE.
“We use WLAN to communicate between the PLC inside the robot and the tablet and really benefit from the robust communication that the Wireless Bolt offers. We also needed to design unique connections for each robot/tablet-pair, so that it is possible to run several machines in the same area without radio interference. This is also something that the Anybus Wireless Bolt allowed us to do.”

Wireless Bolt

Tough demands
But the project has not been without challenges. One issue that INFUSER ran into was that ozone places tough demands on durability. Although the Wireless Bolt is IP67-classed (meaning that it is waterproof down to 1 meter's depth), INFUSER still found that the rubber washer on the Bolt was not ozone-proof.

But since the Anybus Wireless Bolt is mounted in a standard M50 hole, it was easy to find a replacement – a washer that HMS can now offer as an alternative in its own range too.

“Implementing the Wireless Bolt was very smooth indeed,” says Thomas Clapper. “We had communication set up in a matter of minutes and have really not had any issues when it comes to the wireless communication. The Wireless Bolt is simply a very reliable and sturdy wireless solution.”

@HMSAnybus #PAuto #Robotics #Wireless

World’s first LiFi enabled light bar!

21/09/2017
Mainstream adoption of LiFi will come via LED light bars, which are set to replace the most widely utilized light source in the world – fluorescent tubes.

The first LED “light bar” is forecast to replace the most conventional form of lighting within commercial and industrial facilities – fluorescent tubes – of which an estimated 3-4 billion are installed throughout the world.

pureLiFi and Linmore LED will demonstrate this new technology at LuxLive, 15-16 November 2017 in London (GB), as part of their LiFi experience zone.

WiFi versus LiFi

Wireless connectivity is evolving. The spectrum now has to accommodate more mobile users and is forecast to grow to 20 billion connected devices (forming the IoT) by the year 2020, resulting in what is known as the Spectrum Crunch. However, LiFi can open up 1000 times more spectrum for wireless communications to combat this phenomenon. LiFi is a transformative technology, changing the way we connect to the Internet by using the same light we use to illuminate our offices, homes and streets.

Integration of LiFi within LED strip lights will drive mass adoption, enabling LiFi to easily move into full-scale implementation within offices, schools, warehouses and anywhere illumination is required.

Alistair Banham, CEO of pureLiFi says: “This partnership marks a step change for LiFi adoption. We can now offer new solutions that will help industry, future proof their spaces, devices and technology to ensure they are ready to cope with the increased demand for highspeed, secure and mobile wireless communications.”

LiFi utilizes LED lights that illuminate both our workspace and homes to transmit high-speed, bi-directional, secure and fully networked wireless internet.

What is LiFi
LiFi is high-speed, bi-directional, networked and mobile communication of data using light. LiFi comprises multiple light bulbs that form a wireless network, offering a substantially similar user experience to Wi-Fi except using the light spectrum.

Lighting manufacturers are important players in the adoption of LiFi technology. Linmore LED has built its reputation in the retrofit market, and it ensures that its portfolio of LED products performs in the top 1% for energy efficiency in the industry.

Retrofit fixtures are in great demand as many facilities seek to drive down energy costs by as much as 70-80% which can be achieved by converting to LED technology. This trend is also driven by the increased operating life that LEDs provide and the concerns of toxic mercury utilized within fluorescent lamps that complicates disposal. This provides a scenario where building owners and facility managers can adopt LiFi technology while dramatically decreasing lighting-related energy costs at the same time.

Paul Chamberlain, CEO of Linmore LED says: “Utilizing an existing part of a building’s infrastructure – lighting – opens up endless possibilities for many other technologies to have a deployment backbone.  Internet of Things (IoT), RFID, product and people movement systems, facility maintenance, and a host of other technologies are taken to the next level with LiFi available throughout a facility.”

John Gilmore, Linmore's VP of Sales, talks about early adopters of the technology: “We're very excited to be aligning ourselves with pureLiFi. We firmly believe the US Government will be an early adopter of this technology. Our position on the GSA schedule will help buyers easily access the technology.”

LiFi offers lighting innovators the opportunity to enter new markets and drive completely new sources of revenue by providing wireless communications systems. LiFi is a game changer not only for the communications industry but also for the lighting industry, and with LiFi, Linmore certainly has a brighter future. 

@purelifi #Pauto @LinmoreLED ‏#bes

Sink or swim? Drowning under too much info!

16/06/2017

Rachel Cooper, category marketing manager – field services at Schneider Electric, on managing the big data flood.

The Internet of Things (IoT) is constantly in the news. That's understandable, since forecasts anticipate that there will soon be tens of billions of connected devices, helping the IoT sector to generate more than £7.5 trillion worth of economic activity worldwide. In fact, according to the McKinsey Global Institute, the IoT economic impact on factories, retail settings, work sites, offices and homes could total as much as £3.55 trillion by 2025.

Oil refinery control room screen

One area where the IoT is driving development is in smart buildings. Today’s more complex buildings are generating vast quantities of data, but building management systems (BMS) are not leveraging that data as much as they could, and are not always capturing the right data to make useful decisions. With 42 per cent of the world’s energy consumed by buildings, facility managers face escalating demand for environmentally friendly, high-performance buildings that are efficient and sustainable.  The data collected can help them to achieve this.

However, many facility managers lack the time and resources to investigate the methods that can help them turn the flood of IoT and other sensor data they're exposed to into actionable insights.

Forced to do more with less 
Reduced budgets force building owners to manage sophisticated building systems with fewer resources. This issue is further aggravated by older systems becoming inefficient over time. Even when there is sufficient budget, it is increasingly difficult and time-consuming to hire, develop, and retain staff with the skills and knowledge to take advantage of BMS capabilities.

Facility managers also face challenges maintaining existing equipment performance. Components can break or fall out of calibration, and general wear and tear often leads to a marked decline in a building’s operational efficiency. Changes in building use and occupancy can contribute to indoor air-quality problems, uncomfortable environments, and higher overall energy costs. These changes begin immediately after construction is complete.

Owners often undertake recommissioning projects to fine-tune their buildings. Such work is intended to bring the facility back to its best possible operation level. However, recommissioning is often done as a reactive measure, and traditional maintenance may not identify all areas of energy waste. Operational inefficiencies that are not obvious, or that do not result in occupant discomfort, may go undetected.

Upskilling the current workforce
Many tools have come onto the market over the past decade to help employees get a better understanding of their facilities and assist them in their day-to-day operations and long-term planning. This can include anything from dashboards and automated analytics platforms to machine-learning optimisation engines. However, much like the sophisticated BMS platforms available today, each tool you deploy needs further investment in training time. In fact, research shows how evident this training gap is: roughly only 20 per cent of facility managers use 80 per cent of the capabilities available to them within their BMS, while the remaining 80 per cent use a very limited amount (20 per cent) of their system's potential functionality.

With personnel turnover and competing facility-management responsibilities, many facilities are left without staff who have the time to learn the full capabilities of these tools. Of course, outsourcing different functions is one way to overcome these issues. However, vendors must be managed closely to ensure efficacy, and to ensure that outsourcing costs do not accrue significantly as third parties spend more time on-site.

In tech we trust
Technology has become an important part of building management, as BMS play an ever bigger role in how facility managers perform their jobs and operate buildings. Newer technologies like data visualisation dashboards let facility managers view building performance metrics in a single window, helping them to spot trends and gather insights. By visualising data as graphs, charts, and conversions to different equivalents – for example, kWh to pound cost or kWh to carbon footprint – an experienced building operator can manually identify areas of concern for closer inspection.
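As a minimal sketch of the “equivalents” conversion described above, the snippet below turns a kWh reading into an estimated cost and carbon figure. The tariff and grid emission factor are placeholder assumptions for illustration, not Schneider Electric figures.

```python
# Sketch: convert metered energy into dashboard "equivalents".
# Both factors are placeholder assumptions; a real dashboard would use
# the site's actual tariff and the local grid emission factor.

TARIFF_GBP_PER_KWH = 0.12    # assumed electricity price, GBP per kWh
GRID_KGCO2_PER_KWH = 0.233   # assumed emission factor, kgCO2 per kWh

def equivalents(kwh):
    return {
        "kWh": kwh,
        "cost_gbp": round(kwh * TARIFF_GBP_PER_KWH, 2),
        "carbon_kg": round(kwh * GRID_KGCO2_PER_KWH, 1),
    }

print(equivalents(12_500))
# {'kWh': 12500, 'cost_gbp': 1500.0, 'carbon_kg': 2912.5}
```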

Yet, while dashboards can be helpful in determining building behaviour, the data is often complex and challenging to interpret. In fact, even if building staff have the time and skills to review and understand the data, dashboard information alone tells only part of the building performance story. Facility managers can identify where inefficiencies exist but usually not why. This requires additional troubleshooting and investigation. Therefore, dashboards are most effective for simple monitoring in environments where there are plenty of trained staff to perform troubleshooting and identify the root causes of issues.

Analytics is the answer 
To gain more from a BMS deployment, many facility managers are turning to data analytics software to interpret large volumes of BMS data. Best-in-class software automatically trends energy and equipment use, identifies faults, provides root-cause analysis, and prioritises opportunities for improvement based on cost, comfort and maintenance impact. This software complements BMS dashboards because it takes the additional step of interpreting the data – showing not just where but why inefficiencies occur. Engineers can then convert this intelligence into “actionable information” for troubleshooting and preventative maintenance, as well as for solving more complicated operational challenges. 

Using this software, facility managers can proactively optimise and commission building operations more effectively than with a BMS alone. It enables them to understand why a building is or isn’t operating efficiently so that they can introduce permanent solutions rather than temporary fixes. For instance, with data analytics, facility managers can proactively identify operational problems such as equipment that needs to be repaired or replaced. Moreover, it can do this before critical failure and before it has an impact on the building occupants. Repairs can be scheduled before an emergency arises, eliminating costly short-notice or out-of-hours replacement and avoiding failure and downtime. With this proactive approach, equipment becomes more reliable, the cost of replacement and repair can be much lower, and occupants are assured of optimal comfort. In fact, by following best practice, they can even reduce HVAC energy costs by up to 30%.

The Future
Smart, connected technology has taken us beyond the human ability to manage what can amount to hundreds of thousands of data points in large buildings. Efficient operations require a proactive response. Analytics solutions effectively manage the new state of information overload created by a digital world and filter out what’s not valuable to you. For example, they can provide insight on how to fix problems when they are first observed, before total failure. This predictive maintenance approach means capital assets can be preserved and significant energy savings can be made. The advent of IoT means that we must shift our approach to facility management in order to deliver against the financial, wellbeing and sustainability targets of today’s facilities. By investing in a sophisticated BMS, users can uncover which data to ignore and which to act upon. After all, data for data’s sake is useless. Being able to use a building’s performance data to augment operational efficiency, increase occupant comfort, and improve overall energy consumption so that the financial well-being of buildings can be sustained, is of paramount importance.

@SchneiderElec #PAuto #IoT