Automate image-based inspection with artificial intelligence.

02/09/2020

High demands on products as well as high time and cost pressure are decisive competitive factors across all industries and sectors. Whether in the food or automotive industry – quality, safety and speed are today more than ever before factors that determine the success of a company. Zero-defect production is the goal. But how can it be guaranteed that only flawless products leave the production line? How can faulty quality decisions, which lead to high costs, be avoided? In order to test this reliably, a wide variety of methods are used in quality assurance.

A visual inspection with the human eye is possible, but it is often error-prone and expensive: the eye tires and working time is costly. A machine-based inspection, on the other hand, is usually accompanied by complex calibration, i.e. setting up and adjusting every parameter of both software and hardware so that each defect is detected. In addition, product or material changes require recalibration. Furthermore, with the classic, rule-based approach, a programmer or image-processing specialist must write rules specifically for the system that explain how the defects are to be detected. This is complex and, given the very high variance of possible defects, often a barely solvable Herculean task. All of this can cost a disproportionate amount of time and money.

In order to make quality inspection as efficient, simple, reliable and cost-effective as possible, sentin GmbH uses IDS industrial cameras and deep learning to develop solutions that enable fast and robust defect detection. In contrast to conventional image processing, a neural network learns to recognize the relevant features from images itself. This is exactly the approach of the intelligent sentin VISION system. It uses AI-based recognition software and can be trained on the basis of a few sample images. Together with a GigE Vision CMOS industrial camera from IDS and an evaluation unit, it can be easily embedded in existing processes.

Application
The system is capable of segmenting objects, patterns and even defects. Even surfaces that are difficult to inspect do not stop it. Classical applications can be found, for example, in the automotive industry (defect detection on metallic surfaces) or in the ceramics industry (detection of dents on reflective, mirror-like surfaces), but also in the food industry (object and pattern recognition).

Depending on the application, the AI is trained to detect defects or anomalies. With the latter, the system learns to distinguish good from bad parts. If, for example, a surface structure is inspected, such as a metal part in the automotive industry or a ceramic part, the artificial intelligence detects defects as deviations from reference images. By using anomaly detection and pre-trained models, the system can detect defects based on just a few sample images of good parts.
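
The press release does not describe sentin's model internals. As a rough illustration of the underlying idea, learning what a good part looks like from a few reference images and flagging deviations, here is a minimal statistical sketch (NumPy only; names, data and thresholds are illustrative, not sentin's implementation):

```python
import numpy as np

def fit_reference(good_images):
    """Build a per-pixel reference model (mean and std) from aligned
    grayscale images of defect-free parts."""
    stack = np.stack([img.astype(np.float64) for img in good_images])
    return stack.mean(axis=0), stack.std(axis=0) + 1e-6  # avoid divide-by-zero

def anomaly_map(image, mean, std, z_thresh=4.0):
    """Flag pixels deviating more than z_thresh standard deviations
    from the reference model."""
    z = np.abs(image.astype(np.float64) - mean) / std
    return z > z_thresh

# Toy usage: synthetic frames stand in for camera images of good parts.
rng = np.random.default_rng(0)
good = [128.0 + rng.normal(0.0, 2.0, (64, 64)) for _ in range(10)]
mean, std = fit_reference(good)

test = good[0].copy()
test[30:34, 30:34] += 40.0            # simulate a small local defect
defects = anomaly_map(test, mean, std)
print("defect pixels flagged:", int(defects.sum()))
```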

The hardware setup required for training and evaluation consists of an IDS industrial camera and appropriate lighting. The recognition models used are trained using reference images. For example, a system and AI model were configured for the inspection of fabric webs in the textile industry: a difficult task, as defects can be very subjective and very small. The camera for capturing optimum images of textiles and web materials was selected together with IDS on the basis of specific customer requirements. A GigE Vision CMOS camera (GV-5880CP) was chosen, which provides high-resolution data, triggered with precise timing, for accurate image evaluation.

The system learns what constitutes a “good” fabric structure and knows from just a few shots of the fabric what a clean, flawless product looks like. For quality inspection, the image captured by the IDS Vision CP camera is forwarded via the GigE interface to an evaluation computer and processed with the recognition model. This computer can then reliably distinguish good from bad parts, highlight deviations and give an output signal when a defect is found. In this way, slippage and pseudo rejects can be reduced quickly and easily.

Slippage is the proportion of products that do not meet the standard but are overlooked and therefore not sorted out, often leading to complaints. Pseudo rejects, on the other hand, are those products that meet the quality standard but are nevertheless incorrectly sorted out.
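
Expressed as ratios, the two quantities can be computed from simple inspection counts; a small illustration with purely hypothetical figures:

```python
# Illustrative only: counts for one inspected batch are hypothetical.
truly_bad, truly_good = 50, 950        # actual defective / flawless parts
bad_passed = 3                         # defective parts that slipped through
good_rejected = 12                     # flawless parts wrongly sorted out

slippage_rate = bad_passed / truly_bad           # share of defects overlooked
pseudo_reject_rate = good_rejected / truly_good  # share of good parts rejected

print(f"slippage: {slippage_rate:.1%}, pseudo rejects: {pseudo_reject_rate:.1%}")
```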

Both hardware and software of the system are flexible: for multiple or wider webs, additional cameras can easily be integrated into the setup. If necessary, the software also allows re-training of the AI models. “Experience simply shows that a certain amount of post-training is always necessary due to small individual circumstances. With pre-trained models from our portfolio, you need fewer reference images for individualization and post-training,” explains Christian Els, CEO and co-founder of sentin. In this case, the images show the structured surface of a fabric and a small anomaly on it, which was filtered out in the image on the right:

Anomaly extracted from an image of a fabric (sentin GmbH)

Camera
Extremely accurate image acquisition and precise image evaluation are among the most important requirements for the camera used. Perfectly suitable: the GigE Vision CMOS camera GV-5880CP. The model has a 1/1.8″ rolling shutter CMOS sensor (Sony IMX178), which enables a very high resolution of 6.4 MP (3088 x 2076 px, aspect ratio 3:2). It delivers frame rates of up to 18 fps at full resolution and is therefore ideal for visualization tasks in quality control. The sensor from the Sony STARVIS series features back-side illumination (BSI) technology and is one of the most light-sensitive sensors, with a low dark current close to the sCMOS (scientific CMOS) range. It ensures impressive results even under very low light conditions. Thanks to the sensor size of 1/1.8″, a wide range of C-mount lenses is available for the GV-5880CP. “In addition to resolution and frame rate, the interface and the price were decisive factors in the decision for the camera. The direct exchange with the IDS development department has helped us to reduce the time needed for camera integration,” says Arkadius Gombos, Technical Manager at sentin. The integration into the sentin VISION system is done via GenTL and a Python interface.
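
The article only states that the integration uses GenTL and a Python interface. As an assumption-laden sketch of what such an acquisition loop can look like, the open-source "harvesters" GenTL consumer could be used along these lines (the producer path is hypothetical, and method names vary slightly between harvesters versions):

```python
from harvesters.core import Harvester

h = Harvester()
h.add_file("/opt/ids/lib/gentl_producer.cti")  # hypothetical path to a GenTL producer
h.update()                                     # discover connected cameras

ia = h.create_image_acquirer(0)                # first camera found
ia.start_acquisition()
try:
    with ia.fetch_buffer() as buffer:          # grab a single frame
        comp = buffer.payload.components[0]
        frame = comp.data.reshape(comp.height, comp.width)
        # hand "frame" over to the recognition model here
finally:
    ia.stop_acquisition()
    ia.destroy()
    h.reset()
```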

The GigE Vision camera GV-5880CP from IDS ensures precise image acquisition and accurate image evaluation when inspecting fabric webs. (sentin GmbH)

Conclusion
Automated, image-based quality control with artificial intelligence offers many advantages over human visual inspection or conventional machine vision applications. “In AI-based image interpretation, the aim is to create images on which humans can see the error, because then the AI model can do it too,” concludes Christian Els. The system learns to recognize the requirements of the product in a similar way to a human being, but in terms of consistency and reliability an artificial intelligence beats the human brain every time. Even though the brain is capable of remarkable peak performance, an AI can recognize far more complex defect patterns, and the human eye cannot match a camera for endurance and acuity. In combination with deep-learning recognition software, the image processing system therefore enables particularly fast and accurate inspection. Depending on the application, image acquisition and evaluation can take place in just a few milliseconds.

The system can also be applied to other areas such as surface inspection. Similar applications include, for example, the inspection of matte metal or coated surfaces (automotive interiors), natural materials (stone, wood) and technical textiles or leather. Scratches, cracks and other defects on consumer goods can thus be detected and the affected products sorted out. Excluding quality defects and producing only “good stuff” is an indispensable part of quality assurance. IDS cameras in combination with the deep-learning-supported software of sentin GmbH significantly optimize the detection of defects and objects in quality control. This allows the personnel and time expenditure for complaints and rework, as well as pseudo rejects, to be significantly reduced in a wide range of industries and areas.

• See information on other IDS Imaging products. – published on the Read-out Signpost.

@sentin_ai @IDS_Imaging #mepaxIntPR #PAuto #Food


Robotics tackling pandemics.

19/06/2020
Poll for UK Robotics Week reports that over 1 in 3 British adults believe robotics could help manufacture PPE, while over a third think that robot deliveries could aid social distancing.

One in three British adults see a key role for the use of robotics in tackling the COVID-19 crisis and future pandemics, research released today reveals. The public poll*, commissioned by the EPSRC UK Robotics and Autonomous Systems (UK-RAS) Network, is being released ahead of the annual UK Robotics Week, which returns for its fifth year from 22nd – 28th June 2020.  

36% of a representative sample of British adults believe that robotics technology could help to ramp up the manufacture of Personal Protective Equipment (PPE), while 33% feel that robot deliveries and the use of Unmanned Aerial Vehicles (UAVs) could aid social distancing during public health crises such as the current global pandemic. 28% of those polled also think that robotics could play a vital role in automating the cleaning and disinfecting of public places.

The survey reports that the manufacturing sector tops the list of industries in which people think robotics are most useful, highlighted by 42% of respondents, ahead of logistics (30%) and military and defence (20%).  While just under a fifth of those polled (17%) indicated that robotics should be most used in the medical sector, the medical field is also where most people (38%) expect to see the most rapid advancements in the next 12 months. A surge in robotics innovation is also anticipated by the public in 3D printing (34%), logistics (30%) and in the household (29%).

Other key findings from the research include:

– Almost one in five (19%) adults think that robotics should replace people doing physical work

– Whilst 56% of people are as trusting of robotics as they were last year, 16% of people have become more trusting

Professor Robert Richardson

Professor Robert Richardson, Chair of the EPSRC UK-RAS Network, comments: “These findings from our latest survey into attitudes towards robotics show that the public is taking a real interest in how robotics technology is developing, and the benefits of using robots across a gamut of sectors. Throughout the COVID-19 pandemic, we’ve seen examples of specific tasks that robots are able to carry out while removing humans from risk – including disinfecting spaces and transporting medical supplies and food around hospitals – and UK Robotics Week offers a fantastic opportunity to explore how robotic systems can both contribute to our everyday life and work, and also help us prepare for and adapt to unexpected events.”

UK Robotics Week is organised annually by the EPSRC UK-RAS Network, which was founded in 2015 to bring cohesion to the robotics and autonomous systems research base, enhance capital facilities across the country, and support education programmes and public engagement activities.  This year’s programme is showcasing the state-of-the-art in robotics systems research and development and includes prestigious academic challenges and engaging school competitions.  New for this year is the Medical Robotics for Contagious Diseases Challenge, which invites the leading robotics research teams from across the world to submit innovative ideas that could offer solutions as part of a multi-faceted response to the current COVID-19 health crisis and future global pandemics.

* Research carried out online by Opinion Matters throughout 07/06/2020 to 11/06/2020 amongst a panel resulting in 2,014 UK adults responding. All research conducted adheres to the MRS Codes of Conduct (2010) in the UK and ICC/ESOMAR World Research Guidelines. Opinion Matters is registered with the Information Commissioner’s Office and is fully compliant with the Data Protection Act (1998).

@UKRobotics @Rob8Richardson @NeonDrum #Robotics #Health #UKRW20 #coróinvíreas #COVID19 #coronavirus


Compulsory robotics education.

21/06/2019

Robotics education should be made part of the school curriculum, according to a fifth of British adults polled to mark the official start of UK Robotics Week (20th – 27th June 2019).  20% of a representative sample of adults want compulsory lessons to be given in the nation’s schools so that children can fully engage with learning about robotics technology 1.  A quarter (25%) of respondents also say that they are now starting to see the benefits of using robotics in their daily lives.


The research was commissioned by the EPSRC UK Robotics and Autonomous Systems (UK-RAS) Network to mark the start of the 4th annual UK Robotics Week, which returns this week for a spectacular celebration of robotics and autonomous systems innovation, and is boosted by the recent award of a further three-year grant by the Engineering and Physical Sciences Research Council (EPSRC) to the organising EPSRC UK-RAS Network. 

With robotics continuing to inspire stories across popular culture, the survey also reports that 1 in 5 (20%) people would love to see the ‘Iron Man’-style suit in real life, when polled about which robotic developments from films they would love to see.  

In the wider context, the British public feel increased use of robotics would most directly benefit industries such as manufacturing (38%), military and defence (28%), construction (25%) as well as the medical sector (25%).

Other key findings highlighted in the research include:

•        Almost a fifth (18%) of survey respondents said they believe that using robotics is essential for our society and that we need to depend on them more.

•        Over a third of UK adults (34%) believe that robots will take over boring, repetitive jobs allowing humans to do more interesting, fulfilling work.

•        Over 1 in 5 (23%) Brits would like to learn more about the robotics industry and its developments.

Commenting on the survey findings, Professor Guang-Zhong Yang, Chair of the EPSRC UK-RAS Network, said: “We are really pleased to see this range of positive public attitudes in relation to robotics technology, especially with regard to the desire for a key educational focus in this area. We launched UK Robotics Week to provide a focus for showcasing the UK’s innovation strengths in this increasingly vital sector, and to stimulate a national discussion about robotics and its developing role in our society.  We hope to engage and inspire many more people to explore robotics technology during this week’s activities, culminating in our much-anticipated International Robotics Showcase.” 

UK Robotics Week 2019 is highlighting a wide programme of events nationwide, spanning competitions and challenges to public lectures, open days, symposia, hackathons and workshops. The centrepiece of UK Robotics Week will be the International Robotics Showcase, being hosted on Thursday 27th June at the Royal Geographical Society in London. This highlight one-day exhibition and conference will feature live demonstrations of the latest robotic technologies, the winners of this year’s Surgical and Manufacturing Robotics Challenges, plus a full conference programme hosting talks by world-renowned technologists and experts exploring cutting-edge advances in nuclear robotics, offshore robotics, future AI and robotics for space and much more.

1.  Research carried out online by Opinion Matters throughout 07/06/2019 to 11/06/2019 amongst a panel resulting in 2,014 UK adults responding. All research conducted adheres to the MRS Codes of Conduct (2010) in the UK and ICC/ESOMAR World Research Guidelines. Opinion Matters is registered with the Information Commissioner’s Office and is fully compliant with the Data Protection Act (1998).

@UKRobotics #PAuto

Power needs for autonomous robots!

20/08/2018
Jonathan Wilkins, marketing director at EU Automation, argues that the way we power six axis robots needs to be reassessed to meet the needs of new applications such as autonomous mobile robots (AMRs).

Since industrial six-axis robots were popularised back in the 1960s, the technology that makes up robots, as well as the way in which we now use robots, has changed considerably.

What was once considered a high-risk sector, where robots were relegated to operating in cells and cages behind no-go zones, has changed to one where robots can now work in collaboration with human workers.

Advances in motor technology, actuation, gearing, proximity sensing and artificial intelligence have resulted in the advent of various robots, such as CoBots, that are portable enough to be desktop mounted, as well as autonomous mobile robots (AMRs) that can move freely around a facility.

These systems are not only capable of delivering high payloads weighing hundreds of kilograms, but are also sensitive enough to sense the presence of a human being at distances ranging from a few millimetres to a few metres. The robot can then respond in under a millisecond to stimuli, such as a person reaching out to guide the robot’s hand, and automatically change its power and force-limiting system to respond accordingly.

Although six-axis robots and CoBots are predominantly mains powered, portable AMR service robots are gaining popularity in sectors as diverse as industrial manufacturing, warehousing, healthcare and even hotels. In these settings, they can operate 24/7, only taking themselves out of action for charging or being taken offline by an engineer for essential repairs and maintenance.

In the warehousing sector, for example, the picking and packing process can be manually intensive, with operators walking up and down long aisles picking products from a shelf to fulfil each order. This is a time consuming and inefficient process that adds time to the customer order. Using an autonomous mobile robot in this situation can allow the compact robot to pick up the shelf and move it to the human operator in true “goods-to-man” style.

However, this demanding use-cycle prompts the question: are the batteries that power these robots sufficiently suited to this new environment? To answer this, we need to understand the types of batteries used.

The two most popular types of secondary (rechargeable) battery are sealed lead acid (SLA) and lithium-ion (Li-ion). Having been around for nearly 160 years, lead acid technology is capable of delivering high surge currents due to its low impedance. However, this type of battery can be large and heavy, making it impractical for smaller machines.

Lithium-ion, alternatively, provides a much higher energy density, delivering one of the highest energy-to-weight ratios of any rechargeable chemistry, which allows design engineers to use it in even the most compact devices. It also maintains a stable voltage throughout most of its discharge cycle, resulting in highly efficient, long runtimes.

When choosing robotic systems for their application, it’s important that engineers match the right type of battery to the load. As we increasingly begin to rely on smart factories with high levels of portable and mobile automation, considering the power needs of each device will be vital in delivering long run times with high efficiency.
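
Matching the battery to the load ultimately comes down to a simple energy budget. A back-of-the-envelope runtime estimate, with purely hypothetical figures:

```python
# Hypothetical AMR energy budget.
pack_voltage_v   = 48.0
pack_capacity_ah = 40.0      # nominal capacity
usable_fraction  = 0.8       # avoid deep discharge to preserve cycle life
avg_load_w       = 600.0     # drives, controller, sensors, payload handling

usable_energy_wh = pack_voltage_v * pack_capacity_ah * usable_fraction
runtime_h = usable_energy_wh / avg_load_w
print(f"usable energy: {usable_energy_wh:.0f} Wh, estimated runtime: {runtime_h:.1f} h")
```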

@euautomation #PAuto #Robotics

Robotics: A new revenue source for the taxman?

16/04/2018
Robot tax? A tax on robotics is as absurd an idea as a tax on pencils. As Britain’s political parties discuss a potential tax on automation and robotics, Nigel Smith, managing director of Toshiba Machine partner, TM Robotics, explains why slowing down the machine economy would lead to a productivity disaster.

The world’s first robot tax was introduced in South Korea last year. The tax was created amid fears that a rise in automation and robotics was threatening human workers and could lead to mass unemployment in the country. But, this so-called robot tax was not actually a tax at all. Instead, the country limited tax incentives for investments in automation, lessening the existing tax breaks for automation.

Calling it a tax was simply rhetoric delivered by its opponents. Essentially, it was just a revision of existing tax laws. Regardless of its name, South Korea’s announcement sparked several debates as to whether a robot tax would be advantageous in other countries.

At the time, Bill Gates famously called for a technology levy, suggesting that a tax could balance the Government’s income as jobs are lost to automation. The levy was suggested to slow down the pace of change and provide money for Government to increase job opportunities in other sectors.

Taxing robots?

Fewer workers, fewer tax contributions
While most manufacturers and those operating in the robotics sector would disagree with the idea of a tax on robots, the debate does raise questions of how we tax employment in Britain — and how technology could affect this. The obvious fear at Government level is that if we replace people with robots, we reduce national insurance contributions, lessening a Government’s ability to support its people.

As an alternative, perhaps the answer to this problem is switching to a system where, rather than paying tax per employee through national insurance contributions, NIC would be calculated based on a company’s overall operating costs. Using this method, NIC could take account of the impact of all forms of advanced technology, not just robots.

That being said, we are not tax experts at TM Robotics. However, we are experts in industrial robots. We sell industrial robots to manufacturers across the globe and advise them on how robots can increase productivity, efficiency and in turn, create new jobs.

Creating, not destroying jobs
Much of the debate about the potential robot tax has focused on the threat that robots and automation pose to humans. However, we should remember that robots don’t always replace a human job, often they work alongside people to reduce the risk of injury — particularly in the supply chain.

Consider this as an example. TM Robotics recently introduced a robot box-opening cell to its range of industrial equipment. This type of automation would typically be used by companies like DHL and UPS, which deliver product directly into manufacturing plants and retail warehouses, to reduce the risk of injuries from knives. In this instance, a robot tax would undermine a company’s ability to deliver a safe environment for its workers.

The bottom line is that robots create jobs, they don’t take them away. This is supported by the British Government’s recent Made Smarter review on digitalisation in industry. The review concludes that over the next ten years, automation could boost British manufacturing by £455 billion (€525 billion), with a net gain of 175,000 jobs.

Robots are tools and they will create work, especially new kinds of work — taxing them would be a tax on net job creation. Instead of implementing a tax on robots, we should actually be providing tax breaks for companies investing in robotics.

@TMRobotics #PAuto #Robotics @StoneJunctionPR

No escape even for agrochemicals!

28/09/2017
This article discusses key points that are covered in depth in the IDTechEx report “Agricultural Robots and Drones 2017-2027: Technologies, Markets, Players” by Dr Khasha Ghaffarzadeh and Dr Harry Zervos.

New robotics is already quietly transforming many aspects of agriculture, and the agrochemicals business is no exception. Here, intelligent and autonomous robots can enable ultraprecision agriculture, potentially changing the nature of the agrochemicals business. In this process, bulk commodity chemical suppliers will be transformed into speciality chemical companies, whilst many will have to reinvent themselves, learning to view data and artificial intelligence (AI) as a strategic part of their overall crop protection offerings.

Computer vision
Computer vision is already commercially used in agriculture. In one use case, simple row-following algorithms are employed, enabling a tractor-pulled implement to automatically adjust its position. This relieves the pressure on the driver to maintain an ultra-accurate driving path when weeding to avoid inadvertent damage to the crops.

The computer vision technology is however already evolving past this primitive stage. Now, implements are being equipped with full computer systems, enabling them to image small areas, to detect the presence of plants, and to distinguish between crop and weed. The system can then instruct the implement to take a site-specific precision action to, for example, eliminate the weed. In the future, the system has the potential to recognize different crop and weed types, enabling it to take further targeted precision action.
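
A common first step for the plant-versus-soil part of such a pipeline is a vegetation index such as excess green (ExG = 2G - R - B) followed by a threshold. The sketch below illustrates only that first step with made-up values; distinguishing crop from weed requires trained models on top of it:

```python
import numpy as np

def plant_mask(rgb, thresh=20.0):
    """Segment green vegetation from soil using the excess-green index.
    rgb: HxWx3 array (uint8). Returns a boolean mask of plant pixels."""
    img = rgb.astype(np.float64)
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    exg = 2.0 * g - r - b
    return exg > thresh

# Toy frame: brownish soil with a green patch standing in for a plant.
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[:, :] = (120, 90, 60)          # soil
frame[40:60, 40:60] = (60, 160, 50)  # plant
print("plant pixels:", int(plant_mask(frame).sum()))   # 400
```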

This technology is already commercial, although at a small scale and only for specific crops. The implements are still very much custom built, assembled and ruggedized for agriculture by the start-ups themselves. This situation will continue until the market is proven, forcing the developers to be both hardware and software specialists. Furthermore, the implements are not yet fully reliable and easy to operate, and the upfront machine costs are high, leading the developers to favour a robotic-as-a-service business model.

Nonetheless, the direction of travel is clear: data will increasingly take on a more prominent (strategic) role in agriculture. This is because the latest image processing techniques, based on deep learning, feed on large datasets to train themselves. Indeed, a time-consuming challenge in applying deep learning techniques to agriculture is in assembling large-scale sets of tagged data as training fodder for the algorithms. The industry needs its equivalents of image databases used for facial recognition and developed with the help of internet images and crowd-sourced manual labelling.

In the not-too-distant future, a series of image processing algorithms will emerge, each focused on a particular set of crop or weed types. In time, these capabilities will inevitably expand, allowing the algorithms to become applicable to a wider set of circumstances. In parallel, and in tandem with more accumulated data (not just images but other indicators such as NDVI too), algorithms will offer more insight into the status of different plants, laying the foundation of ultra-precision farming on an individual-plant basis.

Agriculture is a challenging environment for image processing. Seasons, light and soil conditions change, whilst the plants themselves change shape as they progress through their different stages of growth. Nonetheless, the accuracy thresholds that algorithms in agriculture must meet are lower than those found in many other applications, such as general autonomous driving. This is because an erroneous recognition will, at worst, result in the elimination of a few healthy crops, not in fatalities. This, of course, matters economically, but it is not a safety-critical issue and is thus not a showstopper.

This lower threshold is important because achieving higher levels of accuracy becomes increasingly challenging: after an initial substantial gain, the algorithms enter a diminishing-returns phase in which far more data is needed for small accuracy improvements. Consequently, algorithms can be commercially rolled out in agriculture far sooner, based on orders of magnitude less data and with lower accuracy, than in many other applications.

Navigational autonomy
Agriculture is already a leading adopter of autonomous mobility technology. Here, autosteer and autoguide technologies, based on outdoor RTK-GPS localization, are already well established. The technology is, however, already moving towards full level-5 autonomy. The initial versions are likely to retain the cab, enabling the farmer/driver to stay in charge and ready to intervene during critical tasks such as harvesting. Unmanned, cab-less versions will also emerge once technology reliability is proven and once users begin to define staying in charge as remote fleet supervision.

The evolution towards full unmanned autonomy has major implications. As we have discussed in previous articles, it may give rise to fleets of small, slow, lightweight agricultural robots (agrobots). These fleets today have limited autonomous navigational capability and suffer from limited productivity, both in individual and fleet forms. This will however ultimately change as designs/components become standardized and as the cost of autonomous mobility hardware inevitably goes down a steep learning curve.

Agrobots of the future
Now the silhouette of the agrobots of the future may be seen: small, intelligent, autonomous mobile robots taking precise action on an individual-plant basis. These robots can be connected to the cloud to share learning and data, and to receive updates en masse. They can be modular, enabling the introduction of different sensor/actuator units as required. They will never individually be as productive as today’s powerful farm vehicles, but they can be in fleet form if hardware costs are lowered and the fleet-size-to-supervisor ratio is increased.

What this may mean for the agrochemicals business is also emerging. First, data and AI will become an indispensable part of the general field of crop protection, of which agrochemical supply will become only a subset, albeit still a major one. This will mandate a major rethinking of the chemical companies’ business model and skillsets. Second, non-selective blockbuster agrochemicals (together with engineered herbicide resistant seeds) may lose their total dominance. This is because the robots will apply a custom action for each plant, potentially requiring many specialized selective chemicals.

These changes will not happen overnight. The current approach is highly productive, particularly over large areas, and off-patent generic chemicals will further drive costs down. Today’s robots are low-lying, restricting them to short crops, and achieving precision spraying with high-clearance (“high boy”) machines will be a mechanical and control engineering challenge. But these changes will come, diffusing into general use step by step and plant by plant. True, this is a long-term game, but playing it cannot be kicked into the long grass for long.

@IDTechEx #Robotics #Agriculture #PAuto

Is AI all it is cracked up to be?

28/03/2017
In this article, Stephen Parker, CEO of Parker Software, examines whether artificial intelligence is all it’s cracked up to be.

If planet Earth had been created one year ago, the human species would be just ten minutes old. Putting this into context, the industrial era would have kick-started a mere two seconds ago. Thanks to human influence, the pace of technological advancement on Earth is astonishing. However, we are already on the verge of the next change. The potential of artificial intelligence has been discussed by scientists since the 1950s and modern technological advances are finally bringing this technology to the masses. 

Research suggests that artificial intelligence could be as ‘smart’ as human beings within the next century. Originally, human programmers were required to handcraft knowledge items painstakingly. Today, however, one-off algorithms can teach machines to take on and develop knowledge automatically, in the same way a human infant would. Artificial intelligence has reached a critical tipping point and its power is set to impact every business, in every industry sector.

Already, 38 per cent of enterprises are using artificial intelligence in their business operations and this figure is set to grow to 62 per cent by 2018. In fact, according to predictions by Forrester, investments in artificial intelligence technology will increase three-fold in 2017. These figures mean that the market could be worth an estimated $47 billion by 2020. 

Intelligent assistance
One of the most notable applications of AI from the past few years is the creation of intelligent assistants. Intelligent assistants are interactive systems that can communicate with humans to help them access information or complete tasks. This is usually accomplished with speech recognition technology; think Apple’s Siri, Microsoft’s Cortana or Amazon’s Alexa. Most of the intelligent assistants that we are familiar with today are consumer facing and are somewhat general in the tasks they can complete. However, these applications are now making their way into more advanced customer service settings.

While there is certainly a space for these automated assistants in the enterprise realm, there is a debate as to whether this technology could fully replace a contact centre agent.

Automation is widely recognised as a valuable tool for organisations to route the customer to the correct agent. However, completely handing over the reins of customer management to a machine could be a step too far for most businesses. Even the most advanced AI platforms only hold an IQ score equivalent to that of a four-year-old and, naturally, businesses are unlikely to entrust their customer service offering to a child.

The human touch
Automated processes are invaluable for speeding up laborious processes and completing monotonous customer service tasks. But as any customer service expert will tell you, the human touch is what elevates good service to an excellent experience for the customer. Simple tasks will no doubt be increasingly managed and completed using automation and AI-enabled agent support systems, whereas complex issues will still require the careful intervention of a human agent.

During a TED Talk on artificial intelligence, philosopher and technologist Nick Bostrom claimed that “machine intelligence is the last invention that humanity will ever need to make.” However, contact centre agents needn’t hang up their headsets just yet. Artificial intelligence won’t be replacing the call centre agent any time soon. The only guarantee is that the role of a call centre agent will continue to evolve. After all, the industrial revolution was only two seconds ago.

@ParkerSoftware #PAuto

It IS rocket science!

13/03/2017

Graham Mackrell, managing director of Harmonic Drive, explains why its strain wave gears have been the top choice in space for over forty years.

Anything that goes into space is seen as the pinnacle of human creation. Astronauts are highly trained and are at the peak of physical fitness, space shuttles are crafted by large teams of expert engineers and all the technology used is so high-tech it’s as if it belongs to science fiction.

Driving on Mars!

Many decades ago, the first Harmonic Drive gears were sent into space during the Apollo 15 mission. Even from the beginnings of the space race, the expectations for the technology used were high. The equipment used in space had to be reliable, compact and lightweight and given the increasing demands on equipment in today’s space missions, it must also now be highly accurate with zero backlash and have high torque capacity.

When aerospace engineers were recently designing a new space rover, they looked to Harmonic Drive gears for reliability. Due to the obvious difficulties of performing repairs in space, a high mean time between equipment failures is a high priority. Harmonic Drive products achieve this by prioritising quality throughout the entire design and manufacturing process.

It is vital that aerospace gears are thoroughly tested before they are sent to customers, ensuring that they always receive a quality product. At Harmonic Drive, we test products using finite element method (FEM) testing. This process simulates real world physics to ensure that the product is capable of surviving in space. For example, structural testing is carried out to ensure the product is robust and the space rover travelling over rough terrain will not damage the actuators used in the wheels. Thermodynamic properties are also important as aerospace gears are often exposed to both extremes of the temperature range, which are tested in the initial design process.

Also considered in the design process is the part count of the aerospace gears. Harmonic Drive gears use a low part count, which means they are maintenance-free. In addition, there is a lower chance of components failing, giving the gears a high Mean Time Between Failures (MTBF). The low part count also contributes to the compactness and light weight of the gears, features that are essential in space.

Another key feature for aerospace gears is high torque capacity and zero backlash. This is essential for systems which communicate the location of the rover to the control room. If traditional, high backlash gears were to be used, the system would misreport the rover’s location. This would cause problems when the rover is used to survey uncharted areas of planets and could lead to inaccurate mapping. Due to the emphasis on high precision with Harmonic Drive gears, this problem can be avoided.

The numerous quality processes that Harmonic Drive undertakes have led to recognition from a number of accrediting bodies. Harmonic Drive products are AS9100 certified, a specific aerospace standard for the design, manufacture and sale of precision gear reducers, servo-actuators and electro-mechanical positioning systems.

To be the pinnacle of global technology, there are no shortcuts. Components used in aerospace technology must be subject to vigorous testing in order to be reliable, safe and have a long product life.

• The MARS adventure: The NASA site.
@HarmonicDriveUK #PAuto #Robotics @StoneJunctionPR

Preparing pharmaceutical and medical technology for the future.

27/09/2016

Production environment requirements in the pharmaceutical and medical technology sectors are very high and producers need to keep abreast of current industry trends. Such trends include: process optimisation for the purpose of increasing overall equipment effectiveness (OEE), effective asset life cycle management, and predictive maintenance using enterprise mobility and intelligent solutions (smart apps). Increasing networking, along with the use of automation technology in accordance with Industry 4.0, has paved the way for these developments.

Industry 4.0
The Internet of Things has found its way into production in the form of Industry 4.0: increasingly networked systems with more communication-capable components have meant an ever-increasing volume of data. Thanks to “big data”, production is being made more and more intelligent, with the pinnacle of achievement being the smart factory. The key to success lies in determining the right information from the mass of data available, analysing the data and drawing findings from them. The aim of such Smart Data Management is to optimise the plant in question and prepare it for the future in terms of site operational excellence.

Robots are increasingly taking on handling tasks to reduce human effort in the medical/pharma sectors, for example by supplying filled syringes to the end-of-line packaging station.

In addition, overall equipment effectiveness (OEE) has developed into a clear trend. The main factors that have contributed towards this are optimised plant utilisation and productivity, for which manufacturing execution systems (MES) and enterprise resource planning (ERP) are of vital importance. Interfaces such as the MES interface from Mitsubishi Electric enable data to be collected quickly and easily at plant level and transferred to higher-level MES or ERP systems for further analysis. OEE can be specifically optimised based on the results, without the need for a gateway PC to transfer the data. Based on the System Q PLC from Mitsubishi Electric, the MES interface can be configured by a plant engineer in around 15 minutes.
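
OEE itself is simply the product of availability, performance and quality. A minimal illustration of the calculation that the collected plant data would feed, using hypothetical shift figures:

```python
# Hypothetical shift data; OEE = availability x performance x quality.
planned_time_min   = 480.0    # one 8-hour shift
downtime_min       = 45.0
ideal_cycle_time_s = 2.0      # best possible time per part
total_count        = 11000
good_count         = 10700

run_time_min = planned_time_min - downtime_min
availability = run_time_min / planned_time_min
performance  = (ideal_cycle_time_s * total_count) / (run_time_min * 60.0)
quality      = good_count / total_count

oee = availability * performance * quality
print(f"availability {availability:.1%}, performance {performance:.1%}, "
      f"quality {quality:.1%}, OEE {oee:.1%}")
```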

Cost control strategies look for a more compact design, shorter production cycles and substantially minimised waste. Automation technologies strongly support these approaches with robot technology in particular being used to achieve these aims.

Collaboration between Robots and Humans
Today’s pharmaceutical and medical technology production environments see robots and human operators increasingly working side by side. Mitsubishi Electric’s integrated Robot Safety Solution helps manufacturers to boost productivity and improve human-machine collaboration by allowing the robot to continue operation, within tight constraints, while operators access its work cell. Safety sensing technology detects movements in two predefined zones within the operating environment of the manufacturing cell and transmits the information to the SafePlus safety system. A reduced operating speed or a movement stop is then assigned to the robot in real time, enabling operators to work in close proximity to the moving robot without a safety cage. As a result, humans and robots are able to work within an environment where the risk of danger is eliminated.
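
Conceptually, the zone logic maps detected occupancy to a speed command. The schematic sketch below illustrates only that mapping; in the real system this is handled by certified safety hardware, and the zone names and override values here are invented:

```python
from enum import Enum

class ZoneState(Enum):
    CLEAR        = 0   # nobody detected near the cell
    WARNING_ZONE = 1   # operator in the outer zone
    STOP_ZONE    = 2   # operator in the inner zone, next to the robot

def speed_override(state: ZoneState) -> float:
    """Fraction of programmed speed to apply for a given zone state."""
    if state is ZoneState.STOP_ZONE:
        return 0.0     # safe stop while the operator is inside the work cell
    if state is ZoneState.WARNING_ZONE:
        return 0.25    # reduced, collaborative speed
    return 1.0         # full programmed speed

print(speed_override(ZoneState.WARNING_ZONE))   # 0.25
```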

Robot-assisted handling solutions: compact, flexible and quick
Space is an expensive commodity, especially in cleanrooms. The manufacture and maintenance of these plants where an extremely high level of hygiene is required are extremely costly. Compact components are all the more important as ultimately, the machine needs to be space-saving. Mitsubishi Electric components such as SCARA and articulated arm robots, controllers and servo drives are characterised by their particularly space-saving design and are suitable for flexible applications even when space is restricted. Easy handling enables fast integration, commissioning and adjustment.

One example of a highly compact handling solution came from Robotronic AG. They required a secondary packaging solution for supplying and packing filled vials of various sizes, to be integrated into an existing system with limited available space. The modular design principle of the modular robot technology (MRT) produced by Robotronic provides excellent design flexibility. As a result, the basic module for the MRT cell has a footprint of just 1.0 x 1.30 metres and is approximately 2.20 metres high, so it also meets the minimal space requirements. The solution, for cleanroom class D in accordance with GMP standards, consists of two MRT cells, each with a compact robot from Mitsubishi Electric and a conveyor line with eight positioning screws, driven by Mitsubishi Electric servo motors. The robots place the vials in the blister packs at a processing speed of 300 units per minute.

Hygiene in cleanroom systems
The increasing use of automation technologies, especially robots, has led to an increase in the demand for systems which meet high cleanroom requirements. It is also just as important to be able to clean a plant before a production changeover without incurring major costs. That means it must be possible to clean the components in place (i.e. be CIP-compatible) using aggressive chemicals like H2O2. For that reason, Mitsubishi Electric also offers its customers multi-resistant versions of its new generation of MELFA robots, which have been approved for regular CIP cleaning using H2O2. MELFA robots can even meet cleanroom class ISO 3 requirements, are dust-proof and offer IP67 environmental protection.

@MitsubishiFAEU #PAuto


Future factory – a moderator’s impression!

01/02/2016

Read-out was asked to moderate the automation stream at the National Manufacturing & Supplies Conference held last week (26th January 2016) outside Dublin. In their wisdom the organisers selected “Future Factory!” as the title for this half-day seminar, and 11 speakers were organised to speak on their particular subjects for about 15 minutes each. This format was replicated in each of the more than a dozen seminars held on this one day.


Long queues lasted well into the morning to enter the event!

We were a little sceptical that this would work, but with the help of the organisers and the discipline of the speakers the time targets were achieved. Another target achieved was the number of attendees at the event, as well as at this particular seminar.
In all, between exhibitors, speakers and visitors, well over 3,000 people packed the venue, probably far more than the organisers had anticipated and hopefully a potent sign that the economy is again on the upturn. Indeed, it was so successful that it was trending (#MSC16) on Twitter for most of the day.

Seminar
But back to our seminar. If you google the term “Future Factory” you get back 207 million links, yet it is difficult to find a simple definition of what it means. The term automation is similarly difficult to define, though the Irish term “uathoibriú” is perhaps a little clearer, literally meaning “self-working.”


Good attendance at the Seminar

Background
The world of automation has changed to an extraordinary degree, and yet in other ways it remains the same. The areas where it has experienced least change are sensing – a thermometer is a thermometer – and final control – a valve is a valve. Where it has changed almost to the point of unrecognisability is in that bit in the middle: what one does with the signal from the sensor to activate the final control element.

Instruments evolved from the single-parameter, dedicated indicator/controller/recorders of the sixties, which transmitted either pneumatically (3-15 psi) or electrically (4-20 mA). Gradually (relatively speaking) most instruments became electronic, smaller in size and multifunctional. The means of communication changed too: fieldbus communication became more common for interacting with computers, which themselves were developing at breakneck speed. Then transmission via wireless became more common, and finally came the internet and the ability to control a process from the computer we call the intelligent phone. There are problems with these latter, internet/cellphone, developments of course. One is that the reach of the internet is focused at present on areas of high population. Another is the danger of infiltration of systems by hostile or mischievous strangers. The importance of security protocols is one that has only recently become apparent to automation professionals.
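
The 4-20 mA loop mentioned above maps the measured range linearly onto a live zero of 4 mA, so converting a loop current back to engineering units is a one-line calculation (the range in the example is hypothetical):

```python
def current_to_value(ma, lo, hi):
    """Convert a 4-20 mA loop current to an engineering value spanning [lo, hi]."""
    return lo + (ma - 4.0) / 16.0 * (hi - lo)

# Example: a temperature transmitter ranged 0-150 degC
print(current_to_value(12.0, 0.0, 150.0))   # 75.0, i.e. mid-scale
```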

• Many of the presentations are available on-line here. The password is manufac2016

The Presentations
Maria Archer of Ericsson spoke on enabling and facilitating IoT in the manufacturing industry. Diving straight into the topic, she drew on her experience of big data, e-commerce, media, cyber security, IoT and connected devices.

The second speaker was Cormac Garvey of Hal Software, who addressed supply chain prototyping. The supply chain ecosystem is incredibly complex, usually requiring significant integration of each supplier’s standards and processes with the manufacturer’s. Cormac introduced the concept of supply chain prototyping, where easy-to-use, standards-based technology is used to wireframe the entire supply chain ecosystem prior to integration, thus significantly reducing the cost, time and risk of the project. This wireframe can then be used as a model for future integration projects.

Two speakers from the Tralee Institute of Technology, Dr. Pat Doody and Dr. Daniel Riordan spoke on RFID, IoT, Sensor & Process Automation for Industry 4.0. They explained how IMaR’s (Intelligent Mechatronics and RFID) expertise is delivering for their industrial partners and is available to those aiming to become a part of Industry 4.0.

Smart Manufacturing – the power of actionable data was the topic addressed by Mark Higgins of Fast Technology. He shared his understanding of the acute issues companies face on their journey to Business Excellence and how leveraging IT solutions can elevate the business to a new point on that journey.

Dr Garret O’Donnell, Assistant Professor (Mechanical & Manufacturing Engineering) at TCD, explained how one of the most significant initiatives of the last two years has been the concept of the fourth industrial revolution promoted by the German National Academy of Science and Engineering (acatech), known as Industrie 4.0. (Industrie 4.0 was first used as a term in Germany in 2011.)

Another speaker from Fast Technologies, Joe Gallaher, addressed the area of Robotics and how Collaborative Robots are the “Game Changer” in the modern manufacturing facility.

Dr. Hassan Kaghazchi of the University of Limerick and Profibus spoke on PROFINET and Industrie 4.0. Industrial communications systems play a major role in today’s manufacturing systems. The ability to provide connectivity, handle large amount of data, uptime, open standards, safety, and security are the major deciding factors. This presentation shows how PROFINET fits into Industrial Internet of Things (Industrie 4.0).


Maurice Buckley CEO NSAI

The CEO of NSAI, the Irish National Standards Authority, Maurice Buckley explained how standards and the National Standards Authority of Ireland can help Irish businesses take advantage of the fourth industrial revolution and become more prepared to reap the rewards digitisation can bring.

The next two speakers stressed the impact of low forecast accuracy on the bottom line and how this could be addressed. Jaap Piersma, a consultant with SAS UK & Ireland, explained that the impact of low forecast accuracy on business performance is high in industry, but that with the right tools, the right approach and experienced resources you can achieve very significant results and benefits for your business. Following him, Dave Clarke, Chief Data Scientist at Asystec, who maintains the company strategy for big data analytics service development for customers, showed how incredible business opportunities are possible by harnessing the massive data sets generated in the machine-to-machine and person-to-machine hyper-connected IoT world.

The final speaker, David Goodstein, Connected Living Project Director at GSMA, described new form-factor mobile SIMs, robust and remotely manageable, which are an essential enabler for applications and services in the connected world.

All in all a very interesting event and useful to attendees. Papers are being collected and should be available shortly on-line.

It is hoped to do it all again next year, on 24th January 2017 (#MSC17).

See you there.

@NationalMSC #MSC16 #PAuto #IoT