Machine Vision News


Wood, NERA Team Up for AI-Powered Subsea Inspections

The technology has the potential to revolutionize the way asset inspections are conducted in challenging and high-risk environments across a range of industries, while increasing the speed and accuracy of issue detection, response, and resolution. According to a statement released by NERA, Wood's Augmented Machine Vision Solution provides a real-time inspection device capable of autonomously detecting and categorizing equipment anomalies. "The solution will minimize safety risks, enhance asset integrity and create potential savings of AUD $2.8 billion per year for the Australian offshore energy industry alone

Photo: Xsens

Robotics Community Gains Access to Xsens IMUs

The partnership enables customers to choose from a range of IMU options and to specify the best MTi-series product for their application. Clearpath Robotics already provides a component sales service for low-volume customers such as university researchers and development engineers, stocking and shipping products such as LiDARs, collaborative robots, machine vision cameras and force sensors. Now these customers will also be able to buy MTi-series IMUs from stock, for immediate shipment in any order quantity. Shahab Khokhar, Business Manager Components at Clearpath Robotics, said: "Xsens' MTi-series IMUs are an attractive new option for our customers

The Riptide AUV (Credit BAE Systems)

Good Undersea Vehicles Come in Small Packages

Another small vehicle that has recently come on the scene is the RangerBot. This vehicle takes a different approach to delivering an affordable solution for end users. The RangerBot was designed by engineers at Queensland University of Technology (QUT) in Brisbane, Australia. The team at QUT works in a robotics center focused on machine vision, and also works to support environmental assessments on the Great Barrier Reef. A key paradigm shift was enabled when the team, recognizing that water conditions in their target environment were very clear, chose to employ exclusively vision-based sensing.

RangerBot over a reef (Credit Matthew

New model tool: the iCon inspection robot searches for cracks. CREDIT: OceanTech

Subsea Robots in the Splash Zone

OceanTech has developed its Deepwater Inspection Tool into a splash zone tool able to effect repairs below the waterline. Its crack-finding sensor, originally a handheld tool for divers, is affixed to a probe carried by the robot, allowing it to trace subsea structures. It is delivered by VAT. Using machine vision and automated thrust in six directions, the tool follows a weld right around a structure, staying just millimeters from the item being studied. When a crack is detected, its length and depth are seen topside, and another splash zone tool can be deployed. For deeper water, an ROV will be
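The excerpt does not describe the control logic behind the weld-following behavior, but this kind of task is commonly framed as a visual-servoing loop: detect the weld seam in each camera frame, measure its offset from a reference position in the image, and command lateral thrust to null the error while creeping forward. The sketch below is a minimal illustration of that idea in Python with OpenCV; the seam detector, the gains, and the send_thrust interface are assumptions for illustration, not OceanTech's implementation.

```python
import cv2
import numpy as np

def detect_weld_seam(frame):
    """Return the strongest straight line (rho, theta) in the frame,
    used here as a stand-in for the weld seam, or None if nothing is found."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    lines = cv2.HoughLines(edges, 1, np.pi / 180, threshold=120)
    if lines is None:
        return None
    return lines[0][0]  # (rho, theta) of the strongest line

def follow_weld(capture, send_thrust, gain=0.005):
    """Toy visual-servoing loop: keep the detected seam centred in the image
    while creeping forward. Range-keeping (the 'millimeters from the item'
    part) would need a range sensor or stereo vision and is omitted here."""
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        seam = detect_weld_seam(frame)
        if seam is None:
            send_thrust(surge=0.0, sway=0.0)   # hold position if the seam is lost
            continue
        rho, theta = seam
        h, w = frame.shape[:2]
        cos_t = np.cos(theta)
        cos_t = cos_t if abs(cos_t) > 1e-6 else 1e-6
        seam_x = (rho - (h / 2) * np.sin(theta)) / cos_t   # seam x at mid-height
        error = seam_x - w / 2                             # pixels off-centre
        send_thrust(surge=0.2, sway=float(np.clip(gain * error, -1.0, 1.0)))
```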

(Photo: i-Tech 7)

i-Tech 7 Performs UWILD for Borr Drilling

An ROV system is to be installed on the Paragon MSS-1, along with two ROV personnel and one inspector, to complete the operations. i-Tech 7 said its portfolio of vessel-based UWILD solutions can carry out inspections on mobile offshore drilling units (MODUs) and floating production, storage and offloading units (FPSOs), using ROV machine-vision technology combined with fast acquisition, processing and reporting of data. Robin Mawhinney, i-Tech 7's regional director for Europe, Africa and Canada, said, "This is another great project example that showcases the strength and depth of our UWILD service offering. "Our

Photo: Oceaneering International

MTR100: #4 Martin McDonald, SVP, Oceaneering

tasks, sensors, intervention tooling, and system diagnostics, which leads to improved performance and efficiency gains. Continued software and control systems development are key components to enable subsea residency and autonomous interventions." But he noted that machine learning and machine vision are important, too. "We have been working on automated operations, such as auto docking, where the ROV pilot can direct the ROV to move autonomously to a docking point by moving a cursor on the screen, without any intervention on the joystick. It is the machine vision recognition software

Martin McDonald, Senior Vice President, ROV Division, Oceaneering International.
Courtesy of Oceaneering International

One-on-One with Martin McDonald, SVP, ROV Division, Oceaneering

manipulator tasks, sensors, intervention tooling, and system diagnostics, which leads to improved performance and efficiency gains. Continued software and control systems development are key components to enable subsea residency and autonomous interventions. There's also machine learning and machine vision. As I mentioned earlier, we have been working on automated operations, such as auto docking, where the ROV pilot can direct the ROV to move autonomously to a docking point by moving a cursor on the screen, without any intervention on the joystick. It is the machine vision recognition software
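The interview describes auto docking only at a high level. One common way to realize "click a point, let the vehicle go there" is image-based visual servoing: the pilot's click defines a visual target, a tracker keeps locating that target in subsequent frames, and a controller drives the pixel error toward zero while the vehicle closes range. The Python/OpenCV sketch below illustrates the idea under those assumptions; the template tracker, thresholds, and the command_velocity interface are hypothetical and not Oceaneering's software.

```python
import cv2
import numpy as np

def track_dock_target(frame, template, min_score=0.6):
    """Locate the docking feature via normalized cross-correlation.
    Returns the (x, y) pixel of the best match centre, or None if the match is weak."""
    result = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    if max_val < min_score:
        return None
    th, tw = template.shape[:2]
    return (max_loc[0] + tw // 2, max_loc[1] + th // 2)

def auto_dock(capture, click_xy, command_velocity, gain=0.002, patch=25):
    """Simplified click-to-dock loop: the pilot's click defines a template,
    then the vehicle is steered so the tracked feature stays at the image
    centre while it closes range at a slow, constant surge speed."""
    ok, first = capture.read()
    if not ok:
        return
    x, y = click_xy                      # bounds checking omitted for brevity
    template = first[y - patch:y + patch, x - patch:x + patch].copy()
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        target = track_dock_target(frame, template)
        if target is None:
            command_velocity(surge=0.0, sway=0.0, heave=0.0)   # hover if lost
            continue
        h, w = frame.shape[:2]
        err_x = target[0] - w / 2        # lateral pixel error
        err_y = target[1] - h / 2        # vertical pixel error
        command_velocity(surge=0.1, sway=gain * err_x, heave=gain * err_y)
```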

MTR does not present an "MTR100 Creative Photo" award, but if we did, this year's winner would be Houston Mechatronics. Pictured is Houston Mechatronics' Aquanaut in wet testing earlier this year, holding its MTR100 'trophy'. (Photo: Houston Mechatronics)

MTR100: The Ones to Watch

digitization. The company currently has 12 full-time engineers with experience spanning mechanical engineering, computer vision and machine learning, and electronic and robotics engineering. Forssea is designing, building and qualifying a new autonomous ROV called ATOLL, which is designed to use machine vision technology to dock on subsea infrastructure. Because vision-based technology is critical to the autonomous docking of ATOLL, Forssea has developed a range of vision-based products, ranging from low-cost Ethernet cameras, adapted to augmented reality applications, to an enhanced vision product
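The excerpt mentions low-cost cameras adapted to augmented-reality applications alongside vision-based docking. One widely used building block in that space (not necessarily what Forssea uses) is estimating the 6-DoF pose of a fiducial marker mounted on the docking structure. The sketch below assumes OpenCV with the contrib aruco module (whose API varies between versions), plus made-up camera intrinsics and marker size.

```python
import cv2
import numpy as np

# Hypothetical calibration values; a real deployment would use the camera's
# calibrated intrinsics and the true marker dimensions.
CAMERA_MATRIX = np.array([[800.0, 0.0, 640.0],
                          [0.0, 800.0, 360.0],
                          [0.0, 0.0, 1.0]])
DIST_COEFFS = np.zeros(5)
MARKER_SIZE_M = 0.20   # assumed 20 cm fiducial on the docking structure

# Object-space corners of a square marker centred at the origin
_HALF = MARKER_SIZE_M / 2.0
MARKER_CORNERS_3D = np.array([[-_HALF,  _HALF, 0.0],
                              [ _HALF,  _HALF, 0.0],
                              [ _HALF, -_HALF, 0.0],
                              [-_HALF, -_HALF, 0.0]])

def estimate_dock_pose(frame):
    """Detect an ArUco fiducial and return (rvec, tvec), the marker pose in the
    camera frame, or None if no marker is visible."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, aruco_dict)
    if ids is None:
        return None
    image_pts = corners[0].reshape(4, 2).astype(np.float64)
    ok, rvec, tvec = cv2.solvePnP(MARKER_CORNERS_3D, image_pts,
                                  CAMERA_MATRIX, DIST_COEFFS)
    return (rvec, tvec) if ok else None
```

The translation vector gives the camera-to-marker offset in meters, which is the quantity a docking controller would regulate toward a target standoff.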

Kongsberg’s Yara Birkeland unmanned container ship concept. (Image: Kongsberg)

Ocean Autonomy: Norway to the Fore

, micro-electro-mechanical systems, and big data. As an example, systems are being developed which could sense and distribute forces along the body of an underwater vehicle in order to compensate for or reduce drag, says Sørensen. He also cites micro-to-macro actuation and sensing, and machine vision systems using hyperspectral sensing, which can take in a wide range of wavelengths to classify and detect things that we haven't been able to before. The possibilities are vast. Sørensen also sees a "democratization" of this space. With cheaper satellites and commercial underwater drones
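Hyperspectral classification of the kind Sørensen alludes to is often prototyped with very simple per-pixel methods before deep models are brought in. As a generic illustration (not a description of any NTNU system), the sketch below applies the spectral angle mapper to a NumPy cube of shape (height, width, bands) against a few reference spectra, e.g. coral, algae and sand; all names and thresholds are assumptions.

```python
import numpy as np

def spectral_angle(cube, reference):
    """Spectral angle (radians) between every pixel spectrum in an (H, W, B)
    hyperspectral cube and a single reference spectrum of length B."""
    flat = cube.reshape(-1, cube.shape[-1]).astype(np.float64)
    ref = reference.astype(np.float64)
    cos = flat @ ref / (np.linalg.norm(flat, axis=1) * np.linalg.norm(ref) + 1e-12)
    return np.arccos(np.clip(cos, -1.0, 1.0)).reshape(cube.shape[:2])

def classify_pixels(cube, references, max_angle=0.15):
    """Assign each pixel the index of the closest reference spectrum
    (e.g. coral, algae, sand), or -1 if nothing matches within max_angle."""
    angles = np.stack([spectral_angle(cube, r) for r in references], axis=-1)
    labels = np.argmin(angles, axis=-1)
    labels[np.min(angles, axis=-1) > max_angle] = -1
    return labels
```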

ASV Global SIMVEE Project (Photo: ASV Global)

ASV Global, BMT Team-up for Autonomous Navigation Project

ASV Global (ASV) is leading a new £1.2 million research project in partnership with BMT to enhance the safety and reliability of autonomous navigation. The project team will use deep learning machine vision systems trained with a combination of simulated and real-world data. Part-funded by Innovate UK, the UK's innovation agency, the project will enhance situational awareness, enabling the USV to operate in extreme and congested marine environments. The Synthetic Imagery training for Machine Vision in Extreme Environments (SIMVEE) project will build upon ASV's existing
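The article does not detail how the simulated and real-world imagery are combined during training. A common generic pattern (offered here only as an illustration, not as the SIMVEE approach) is to concatenate the two datasets and weight the sampler so that real images are not swamped by the far larger volume of rendered frames. The sketch below assumes PyTorch/torchvision and hypothetical directory names.

```python
import torch
from torch.utils.data import ConcatDataset, DataLoader, WeightedRandomSampler
from torchvision import datasets, transforms

# Hypothetical directory layout; not from the SIMVEE project.
tfm = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])
real_ds = datasets.ImageFolder("data/real_marine", transform=tfm)
sim_ds = datasets.ImageFolder("data/simulated_marine", transform=tfm)

combined = ConcatDataset([real_ds, sim_ds])

# Weight samples so a minibatch draws roughly half real, half simulated imagery,
# regardless of how many synthetic frames were rendered.
weights = torch.cat([
    torch.full((len(real_ds),), 0.5 / len(real_ds)),
    torch.full((len(sim_ds),), 0.5 / len(sim_ds)),
])
sampler = WeightedRandomSampler(weights, num_samples=len(combined), replacement=True)
loader = DataLoader(combined, batch_size=32, sampler=sampler)
```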
