From factories to logistics centers, robots are starting to evolve from simply repeating commands to deciding the next step on their own. This shift is where the true ROI for managers will become apparent over the next few years.
From command recipient to decision-maker: thanks to Agentic AI, robots no longer just stubbornly execute pre-programmed scripts. Instead, they analyze visual data in real time and autonomously decide on the next process step.
(Image: Nano Banana / AI-generated)
Over the past decade, collaborative robots (cobots) have been seen primarily as efficient, tireless helpers that perform strictly scripted, repetitive tasks, while the actual decision-making authority remains with humans. They weld along predefined seams, move pallets along fixed routes and stop as soon as a sensor detects something unexpected. While this model delivered efficiency, it also placed a fixed ceiling on what automation could achieve, as any exception still had to be delegated to a human.
Agentic AI is now beginning to break down this ceiling by giving robots limited autonomy over their own micro-decisions. Instead of asking, "What did the programmer specify?", these systems can ask, "What should happen next based on what I see?". They then act without stopping the line to wait for a (human) supervisor.
The result is not science fiction-like general intelligence, but a pragmatic leap forward: robots that are responsible for an entire workflow, not just a single movement in it.
How Robots Learn from Humans and Manuals
Two advances are making this change possible: learning from videos and learning from language.
Robots can now be trained via video by watching highly skilled operators perform tasks, with vision models translating human movements, tools and results into machine-understandable patterns. The robot does not simply replay a recorded trajectory. It learns how specific visual and physical conditions correlate with the correct next action.
On the language side, Large Language Models (LLMs) and Vision Language Models (VLMs) process the same operating manuals and work instructions that technicians use, turning this dense documentation into operational "playbooks". Instead of a human reading a 200-page welding or casting manual and then translating it into parameters for a robot, the AI layer can consume the manual directly and derive rules such as acceptable tolerances, error taxonomies and escalation paths.
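As a minimal sketch of what such a derived rule might look like once extracted from a manual, consider the following structure. The field names and the 0.5 mm threshold are purely illustrative assumptions, not taken from any real standard or product:

```python
from dataclasses import dataclass

# Hypothetical shape of one rule an LLM/VLM layer might distill
# from a welding manual; all names and values are illustrative.
@dataclass
class PlaybookRule:
    defect_type: str         # e.g. "porosity"
    max_tolerance_mm: float  # size above which action is required
    action: str              # "accept", "rework" or "escalate"
    escalation_path: str     # who or what handles the exception

# A manual sentence like "Porosity above 0.5 mm requires rework by a
# certified welder" could be distilled into:
rule = PlaybookRule(
    defect_type="porosity",
    max_tolerance_mm=0.5,
    action="rework",
    escalation_path="certified_welder",
)
```

The point of such a structure is that downstream robot cells can evaluate it mechanically, instead of a programmer re-reading the manual for every parameter change.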
Combine these capabilities and you get robots that are anchored in both how people actually work and how the process should work on paper.
The Autonomous Inspection Control Loop
The first area where this new autonomy appears on a large scale is inspection. Inspection is data-intensive, safety-critical and historically under-automated, making it a perfect starting point for agentic behavior.
Robots can now take on the inspection of complex welding, casting and forging work. For example, they can:
Capture high-resolution visual and depth data on joints, surfaces and internal geometries.
Classify defects, such as porosity, cracks, burn marks, offsets or inclusions, against standards coded from manuals and previous human judgment.
Decide whether a particular deviation is acceptable, reworkable or scrap without a human having to check every frame.
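The third step above, the accept/rework/scrap decision, can be sketched as a simple triage function. The tolerance thresholds here are illustrative placeholders, not actual standards:

```python
def triage(defect_type, severity_mm, tolerances):
    """Decide accept / rework / scrap for one detected defect.

    `tolerances` maps a defect type to (rework_limit, scrap_limit)
    in millimeters; the numbers below are hypothetical examples.
    """
    rework_limit, scrap_limit = tolerances[defect_type]
    if severity_mm <= rework_limit:
        return "accept"       # within tolerance, no action needed
    if severity_mm <= scrap_limit:
        return "rework"       # out of tolerance but repairable
    return "scrap"            # beyond the repairable limit

tolerances = {"porosity": (0.5, 2.0), "crack": (0.0, 1.0)}
print(triage("porosity", 0.3, tolerances))  # accept
print(triage("crack", 0.4, tolerances))     # rework
```

In a real cell, the classifier's defect label and measured severity would feed this function directly, so only genuinely ambiguous frames ever reach a human.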
The decisive factor is that today's systems can go one step further: they can close the loop by autonomously generating work orders and inserting them into repair queues. If a weld seam on an aircraft frame is outside the tolerance bands, the robot does more than switch on a red warning light: it logs the defect type, location, severity and recommended corrective action, and then creates a digital work order for the appropriate technician or the downstream robot cell.
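A hedged sketch of that closing step, turning an out-of-tolerance finding into a queued work order, might look as follows. All field names and the example values are assumptions for illustration:

```python
from datetime import datetime, timezone
from uuid import uuid4

def create_work_order(defect_type, location, severity, action, queue):
    """Turn an out-of-tolerance finding into a digital work order
    and append it to a repair queue. Field names are illustrative."""
    order = {
        "id": str(uuid4()),
        "created": datetime.now(timezone.utc).isoformat(),
        "defect_type": defect_type,
        "location": location,          # e.g. frame section and seam id
        "severity": severity,
        "recommended_action": action,  # routed to a technician or robot cell
    }
    queue.append(order)
    return order

repair_queue = []
create_work_order("weld_offset", "frame-12/seam-3", "high",
                  "regrind_and_reweld", repair_queue)
```

Because the order carries type, location, severity and a recommended action, the downstream system can dispatch it without a human re-inspecting the part first.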
This transforms inspection from a passive gatekeeper into an active orchestrator of rework, which shortens throughput times and makes quality data immediately usable. For manufacturers and logistics operators, this control loop translates into measurable results: higher first-time yield, less rework, better traceability and more stable schedules, as fewer surprises occur late in the process.
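First-time yield, one of the metrics mentioned above, has a standard definition: the share of units that pass inspection without any rework. A one-line sketch with made-up example numbers:

```python
def first_time_yield(passed_first_time, total_units):
    """First-time yield: fraction of units passing inspection
    with no rework. Example figures below are hypothetical."""
    return passed_first_time / total_units

print(first_time_yield(940, 1000))  # 0.94
```

Tracking this number per closed loop, rather than per plant, makes it visible which handovers to autonomous systems are actually paying off.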
The management-level question should shift from "How many robots do we have?" to "How many closed control loops have we handed over to autonomous systems?".
What AI Can't Do Yet (And Why the "Human in the Loop" Remains)
Even though inspection is becoming increasingly autonomous, robots are not yet ready to take over the most complex process decisions. In highly complex welding, human experts still pick up on subtle cues (faint changes in arc noise, a slight discoloration, the feel of heat through the gloves) to decide in real time how to adjust technique, consumables and temperature.
These decisions are based on years of experience that has never been fully documented, let alone labeled on a large scale for training purposes. Current systems also struggle with novel scenarios, such as one-off repairs on unique equipment, improvised fixtures, or interpreting incomplete or inconsistent documentation. When operators quickly adapt to a slightly warped casting or a non-standard joint configuration, they combine formal rules with an intuition about risk, cost and downstream effects that today's models can only approximate.
The result is a clear short-term balance: robots will increasingly make decisions within well-defined domains, while humans continue to define the boundaries, handle edge cases and refine the playbooks. Managers should resist both extremes—both the hype that robots will immediately replace specialists and the skepticism that they will never be able to do what experts can.
The more realistic path is a progressive handover: first inspection autonomy, then autonomy in standardized rework procedures, and only later in complex, craft-intensive operations, once more multimodal data from expert work has been collected.
Date: 08.12.2025
How Managers Realize the ROI of "Thinking" Robots
To benefit from agentic robots, leaders should view this change as a transformation of information and decision rights, not a hardware refresh. Three priorities stand out:
Build a robust digital backbone: Agentic systems rely on consistent access to 3D models, historical quality data, manuals and work instructions. Fragmented or siloed data, not sensor performance, becomes the biggest brake on autonomy.
Treat expert knowledge as a strategic asset: Systematically capture the decisions of experienced welders and inspectors in video and data form so that future models have a rich foundation ("ground truth") for learning, rather than relying solely on documentation that lags behind practice.
Redesign roles and KPIs: As robots take over more closed-loop operations, human work shifts to oversight, exception handling and continuous improvement; metrics should reward reduced deviations, faster recovery and quality stability, not just throughput.
A simple thought experiment for a plant manager illustrates the opportunity: imagine a repetitive activity that requires little judgment and where your best people say, "I know the answer the second I see it." These are prime candidates for agentic inspection and triage. Start there. Prove that a robot can own the entire loop from observation to action, and then expand into more sophisticated tasks as the technology and your data maturity evolve.
Managers who act early will not only own more robots. They will own more of the decision-making structure of their operations. In an era where resilience, quality and speed are strategic differentiators, shifting decisions from "repeat what you've been told" to "decide what needs to happen next" could be the most consequential automation upgrade of the next decade.
*About the author: Dijam Panigrahi is co-founder and COO of , a leading provider of cloud-based platforms that enable compelling, high-quality digital twin experiences on mobile devices for enterprises.(mc)