From AI and 3D to sensor integration, the latest developments in machine vision are resulting in ever-more precise, cost-efficient, faster and higher-resolution image capture. Here four experts discuss how advancements in the technology are impacting sectors as diverse as warehousing, mobility, offshore and banking…
Alex MacPherson, Director of Solution Consulting and Account Management, Manhattan Associates
It’s no surprise that robotics is a diverse industry, as are the sectors that use it, and implementing robotics in supply chains offers significant upside. To succeed, firms need scalability and agility in both their hardware and software. The core capability AI brings is a limited degree of autonomy, which in turn allows robots to work in unstructured environments – essentially what happens when a robot picks items in a warehouse.
However, the greatest benefits come from combining robot platforms with a warehouse management system. As robotic systems become more sophisticated, we expect different platforms to collaborate. For example, pallet-moving robots could handle the point-to-point movement of heavy cargo, while arm-like robots could sort items and, eventually, pick them for customer orders. For these systems to complement an organisation’s workflow, communication and interaction between them will be critical.
Additionally, as demand for collaborative robots increases, AI will remain an essential part of the software ecosystem underpinning the future of supply-chain technology. There are so many moving parts – both literally and metaphorically – that it is difficult to gauge exactly what that future will look like. One thing is certain, however: robotic advances will explode in the supply-chain space over the next decade. It is therefore key that businesses work with the right hardware and software partners to ensure they take advantage of these crucial developments.
Brian Allen, CEO, Vaarst
The adoption of innovative technologies like robotics in the offshore industries is by no means a new thing – we’ve been using remotely operated underwater vehicles (ROVs) for decades. But uptake of more recent innovations, such as simultaneous localisation and mapping, machine learning, and increasing levels of autonomy, has been slower than expected given the opportunities they enable. By deploying the latest advances in machine-vision technology, companies operating in the marine and energy industries can see significant process improvements, dramatically reducing the costs of conducting subsea surveys while making the work safer and much more sustainable.

With machine-vision systems, energy companies can take precise measurements of their subsea assets and their environment – identifying even the smallest defects in the chains holding structures in place, and spotting damage on pipelines and cables. Meanwhile, the same systems will soon be driving these vehicles entirely autonomously. ROVs and autonomous underwater vehicles (AUVs) equipped with these systems can take such measurements far more quickly than alternative technologies like lidar or acoustic sensors.
Also, because they can be supervised remotely from onshore locations, these systems not only remove the need to place divers in hazardous environments, they also reduce the need to have people at sea. That enables the adoption of autonomous ships and reduces reliance on large crews and survey vessels, which cost tens of thousands of pounds a day to charter and burn large quantities of some of the dirtiest fossil fuels in use today. This is a game changer for the industry and the energy transition. Accelerating the adoption of advanced machine-vision technologies has the potential to transform the way companies operating marine assets build, monitor, maintain and decommission their infrastructure. As we build out more offshore wind capacity in efforts to realise a net-zero economy, these process and technology improvements will play a very significant role.
Gideon Richheimer, CEO and Co-Founder, AutoFill Technologies
The inspection of trains, and of the broader infrastructure that makes up a rail network, is vital for optimising safety, keeping maintenance costs low and ensuring services run on time. But currently most inspections in these sectors are carried out manually, and are therefore prone to error. Multi-dimensional sensors, using AI and machine learning, can capture and process massive amounts of visual data synchronously at high frame rates, extract relevant information from the processed images, and report any anomaly detected in real time.
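In outline, such a capture-and-report loop can be sketched very simply. The Python snippet below is a hypothetical illustration only – the sensor feed, scoring function and threshold are stand-ins for real hardware and a trained vision model, not any vendor’s implementation:

```python
# Hypothetical sketch of the capture -> process -> detect -> report loop described
# above. Purely illustrative: `read_frame` stands in for a high-frame-rate sensor
# feed and `score_frame` for a trained vision model.
import time
import numpy as np

ANOMALY_THRESHOLD = 0.45  # assumed score above which a frame is flagged


def read_frame() -> np.ndarray:
    """Stand-in for grabbing one frame from a camera or multi-dimensional sensor."""
    return np.random.rand(480, 640, 3).astype(np.float32)


def score_frame(frame: np.ndarray, reference: np.ndarray) -> float:
    """Toy anomaly score: mean absolute deviation from a 'known good' reference.
    A production system would use a learned model instead of this heuristic."""
    return float(np.mean(np.abs(frame - reference)))


def report(timestamp: float, score: float) -> None:
    """Stand-in for real-time reporting to a maintenance or alerting system."""
    print(f"[{timestamp:.3f}] anomaly detected, score={score:.3f}")


def inspection_loop(reference: np.ndarray, max_frames: int = 50) -> None:
    for _ in range(max_frames):
        frame = read_frame()                   # capture visual data
        score = score_frame(frame, reference)  # extract relevant information
        if score > ANOMALY_THRESHOLD:          # detect an anomaly
            report(time.time(), score)         # report it in real time


if __name__ == "__main__":
    baseline = np.zeros((480, 640, 3), dtype=np.float32)  # reference "healthy" image
    inspection_loop(baseline)
```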
The technology can significantly reduce the total time for inspections, as programmes can be set up to continuously run over periods of time. And, because it is completely evidence-based, it adds objectivity to assessments and analysis, resulting in operational consistency and increased accuracy.
Safety will also be heightened given the always-on nature of the systems, and huge pools of data mean systems are constantly learning, growing and becoming better at predicting when and where anomalies or disruptions may occur.

When it comes to fleets, I predict that over the next few decades people living in urban centres will increasingly move away from owning cars. Instead, we will simply borrow them, either through ride-sharing apps like Uber and Lyft, or from public transport authorities and local councils. With this shift, consumers will no longer be directly responsible for maintaining safety and quality – that will be up to the organisation or authority running the fleet. For them, it is imperative to have access to technology that supports a convenient mobility experience. That requires better vehicle inspection and maintenance, both of which AI has enormous potential to optimise.

In conclusion, automated vehicle inspection systems, underpinned by computer-vision and machine-learning software, will be vital for ensuring the quality and safety of large vehicle fleets and rail infrastructure. What’s more, a reliable transport network stimulates use of public transport, supporting ambitions to cut carbon emissions.
Jai Ganesh, Senior Vice President and Head of Research & Innovation, Mphasis
The pandemic was a catalyst for accelerated digitalisation as we sought new ways to ensure safety and wellbeing, and to maintain business continuity and productivity. This is where I see tremendous potential for advanced machine-vision technology – by leveraging the automation capabilities afforded by AI and machine learning, it is possible to enforce safety measures and streamline manual processes without human intervention. During the height of the pandemic, computer-vision techniques supported government efforts to minimise the spread of Covid-19 through no-touch biometric systems and by monitoring adherence to mask-wearing and social-distancing guidance.

Beyond healthcare, we are seeing this technology applied in other sectors, with increasing use cases not only across industrial processes in manufacturing and automotive, for instance, but also in traditional financial services. In banking, evolving automation and machine-vision technology can help counter fraud and other risks, safeguard personal data, and open avenues to offering better and more innovative products and services.
Computer-vision technology has made tremendous progress in its ability to process complex data sets in real time – including digital images and video, which are converted into a machine-readable format. This makes it possible to extract relevant and useful information and conduct advanced analytics aligned with specific criteria. Records can then be kept for statistical purposes and visualised on a dashboard for further analysis and to inform effective future decision-making. While deep-learning methods are computationally demanding, the maturity and flexibility offered by multiple cloud service providers have made this task easier. As such, the full game-changing benefits of machine vision are yet to be seen.
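As a rough illustration of that flow, the hypothetical Python sketch below converts an image into a numeric array, extracts a simple metric against an assumed criterion, and appends the result to a file a dashboard could visualise – the names and threshold are placeholders for illustration, not a real product’s API:

```python
# Simplified, hypothetical illustration of the image-to-dashboard flow described
# above: decode an image into numbers, extract criteria-aligned metrics, and keep
# a record that a dashboard could chart. Names and thresholds are illustrative only.
import csv
import os
from datetime import datetime, timezone

import numpy as np

BRIGHTNESS_CRITERION = 0.2  # assumed minimum acceptable mean brightness


def image_to_array(height: int = 64, width: int = 64) -> np.ndarray:
    """Stand-in for converting a digital image or video frame into numeric form."""
    return np.random.rand(height, width)


def extract_metrics(pixels: np.ndarray) -> dict:
    """Extract a simple metric and check it against the defined criterion."""
    mean_brightness = float(pixels.mean())
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "mean_brightness": round(mean_brightness, 4),
        "passes_criterion": mean_brightness >= BRIGHTNESS_CRITERION,
    }


def append_record(record: dict, path: str = "vision_metrics.csv") -> None:
    """Keep records for statistical purposes; a dashboard can plot this file."""
    new_file = not os.path.exists(path) or os.path.getsize(path) == 0
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(record.keys()))
        if new_file:
            writer.writeheader()
        writer.writerow(record)


if __name__ == "__main__":
    append_record(extract_metrics(image_to_array()))
```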
This article originally appeared in the April 2022 issue of Robotics & Innovation Magazine