<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Robotics Update &#187; Robot programming</title>
	<atom:link href="https://www.roboticsupdate.com/category/technology/robot-programming/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.roboticsupdate.com</link>
	<description>The Online Magazine for Industrial Robots &#38; Automation</description>
	<lastBuildDate>Tue, 28 Apr 2026 08:50:16 +0000</lastBuildDate>
	<language>en-US</language>
		<sy:updatePeriod>hourly</sy:updatePeriod>
		<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=3.9.40</generator>
	<item>
		<title>Bridging the virtual and real automation worlds</title>
		<link>https://www.roboticsupdate.com/2026/04/bridging-the-virtual-and-real-automation-worlds/</link>
		<comments>https://www.roboticsupdate.com/2026/04/bridging-the-virtual-and-real-automation-worlds/#comments</comments>
		<pubDate>Fri, 24 Apr 2026 09:52:11 +0000</pubDate>
		<dc:creator><![CDATA[Editor]]></dc:creator>
				<category><![CDATA[All News]]></category>
		<category><![CDATA[Omron]]></category>
		<category><![CDATA[Robot programming]]></category>
		<category><![CDATA[Dassault Systemes]]></category>

		<guid isPermaLink="false">https://www.roboticsupdate.com/?p=10672</guid>
		<description><![CDATA[Dassault Systèmes and OMRON have announced their partnership to bridge the gap between information technology (IT) and operational technology (OT). This collaboration enables manufacturers and machine builders to design, simulate, and deploy smarter, more flexible, and higher-performing production systems through a unified approach that merges virtual and real environments. Today’s factories often face a critical issue: [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="https://www.roboticsupdate.com/wp-content/uploads/2026/04/260425_Omron.jpg"><img class="alignright size-medium wp-image-10673" src="https://www.roboticsupdate.com/wp-content/uploads/2026/04/260425_Omron-300x225.jpg" alt="260425_Omron" width="300" height="225" /></a>Dassault Systèmes and <a title="OMRON" href="http://industrial.omron.eu" target="_blank">OMRON</a> have announced their partnership to bridge the gap between information technology (IT) and operational technology (OT). This collaboration enables manufacturers and machine builders to design, simulate, and deploy smarter, more flexible, and higher-performing production systems through a unified approach that merges virtual and real environments.</p>
<p>Today’s factories often face a critical issue: product design, automation, and production systems operate in silos. This fragmentation leads to longer commissioning times, higher error risks, and limited flexibility. OMRON and Dassault Systèmes are breaking down these barriers by creating a seamless link between 3D design and simulation in the virtual world, and robots, sensors and production lines in the physical world.</p>
<p>The collaboration combines Dassault Systèmes’ 3D UNIV+RSES with OMRON’s Sysmac industrial automation platform, enabling manufacturers to design, simulate, validate and deploy production systems within a continuous virtual environment. At the core of the partnership is the Virtual Twin of Production Systems, which allows companies to test a new production line, validate robot behaviour, or optimise logistics flows – prior to building anything physically.</p>
<p>Thanks to this IT/OT convergence, manufacturers benefit from a digital continuum before deployment and during operations. Production lines are designed, simulated, and validated in a virtual environment augmented by Virtual Companions. Performance, safety, maintenance and other scenarios are tested to correct errors before real-world deployment. Once the physical line is installed, real-time data from sensors, controllers, and robots is fed back into the virtual twin. This enables comparison between real and simulated behaviour, fine-tuning, and predictive maintenance to reduce costs and risks.</p>
<p>“Manufacturing is entering a new era. With OMRON, we are building living production systems, AI-driven, self-improving, and software-defined, where the virtual and physical worlds are fused into one continuous loop of learning. Our industry world models transform complexity into intelligence, making factories not just automated, but autonomous. This is how we reinvent industrial systems, from reactive to predictive, from rigid to adaptive and define the next frontier of manufacturing,” said Pascal Daloz, CEO, Dassault Systèmes.</p>
<p>“Our partnership with Dassault Systèmes strengthens our ability to integrate the OT and IT worlds and provide customers with a holistic solution from simulated to fully implemented, intelligent production,” said Motohiro Yamanishi, Company President of the Industrial Automation Company (IAB), OMRON Corporation.</p>
<p>Visit the OMRON website for more information</p>
<p>See all stories for OMRON</p>
]]></content:encoded>
			<wfw:commentRss>https://www.roboticsupdate.com/2026/04/bridging-the-virtual-and-real-automation-worlds/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>ABB Robotics to showcase next-gen automation</title>
		<link>https://www.roboticsupdate.com/2026/04/abb-robotics-to-showcase-next-gen-automation-at-mach/</link>
		<comments>https://www.roboticsupdate.com/2026/04/abb-robotics-to-showcase-next-gen-automation-at-mach/#comments</comments>
		<pubDate>Tue, 07 Apr 2026 09:06:32 +0000</pubDate>
		<dc:creator><![CDATA[Editor]]></dc:creator>
				<category><![CDATA[ABB Robotics]]></category>
		<category><![CDATA[All News]]></category>
		<category><![CDATA[Articulated Arm]]></category>
		<category><![CDATA[Collaborative robots]]></category>
		<category><![CDATA[Control]]></category>
		<category><![CDATA[Events]]></category>
		<category><![CDATA[Robot programming]]></category>
		<category><![CDATA[ABB]]></category>
		<category><![CDATA[MACH]]></category>
		<category><![CDATA[robotics]]></category>

		<guid isPermaLink="false">https://www.roboticsupdate.com/?p=10624</guid>
		<description><![CDATA[ABB Robotics will be demonstrating a range of advanced automation technologies at MACH 2026, highlighting how manufacturers can boost productivity, improve flexibility and accelerate digital transformation through more autonomous and versatile robotics (AVR). On stand 18-640 in Hall 18, visitors will experience the latest developments in collaborative robots, industrial automation cells, digital engineering tools and [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="https://www.roboticsupdate.com/wp-content/uploads/2026/04/260407_ABB.jpg"><img class="alignright size-medium wp-image-10625" src="https://www.roboticsupdate.com/wp-content/uploads/2026/04/260407_ABB-300x225.jpg" alt="260407_ABB" width="300" height="225" /></a><a title="ABB Robotics" href="https://www.abb.com/robotics" target="_blank">ABB Robotics</a> will be demonstrating a range of advanced automation technologies at MACH 2026, highlighting how manufacturers can boost productivity, improve flexibility and accelerate digital transformation through more autonomous and versatile robotics (AVR).</p>
<p>On stand 18-640 in Hall 18, visitors will experience the latest developments in collaborative robots, industrial automation cells, digital engineering tools and lifecycle service and support offerings for metals fabrication applications.</p>
<p>“Demand for automation in metal fabrication is accelerating, especially for welding applications where 29 percent of manufacturers have identified automation as a major priority to help them meet a shortfall in skilled manual workers,” said Alan Conn, Managing Director at ABB Robotics UK &amp; Ireland. “This trend reflects a growing need for the industry to embrace robotic automation to improve productivity and address ongoing skilled labour shortages.”</p>
<p>Exhibits will include examples of ABB’s OmniVance collaborative Arc Welding and Machine Tending application cells. The arc welding cell will show how welding operations can be automated quickly and easily using a collaborative robot. Designed for high-mix, low-volume production environments, the cell enables operators to program welding sequences in minutes using an intuitive Easy Teach Device, reducing programming time and making robotic welding accessible even to first-time users.</p>
<p>The collaborative machine tending solution will demonstrate how cobots can automate repetitive loading and unloading tasks. Based on standardized hardware and intuitive interfaces, the cell simulates a typical machine tending process, enabling operators to manage production through a simple teach-and-run approach.</p>
<p>Alongside the OmniVance cells will be ABB Robotics’ FlexLoader FP800 high-performance robotic cell, highlighting how manufacturers can automate complex material handling tasks while maintaining flexibility and high productivity. Using advanced 3D vision technology, the cell demonstrates a semi-structured bin picking application, showing robots identifying and picking randomly oriented components.</p>
<p>The stand will also showcase ABB Robotics’ digital capabilities, with demonstrations of RobotStudio, the industry-leading offline simulation and programming software tool. RobotStudio enables manufacturers to design, program and optimise robotic systems in a virtual environment before deployment, reducing commissioning time and improving production efficiency.</p>
<p>Supplementing these demonstrations will be a focus on ABB Robotics’ Modernisation Services showing how ABB Robotics can help manufacturers to combine robotics, digital engineering tools and lifecycle services, including upgrades and optimisation, to build smarter, more flexible factories and remain competitive in an increasingly automated manufacturing landscape.</p>
<p>Visit the ABB Robotics website for more information</p>
<p>See all stories for ABB Robotics</p>
]]></content:encoded>
			<wfw:commentRss>https://www.roboticsupdate.com/2026/04/abb-robotics-to-showcase-next-gen-automation-at-mach/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Webinar explores physical AI in robotics</title>
		<link>https://www.roboticsupdate.com/2026/03/webinar-explores-physical-ai-in-robotics/</link>
		<comments>https://www.roboticsupdate.com/2026/03/webinar-explores-physical-ai-in-robotics/#comments</comments>
		<pubDate>Tue, 24 Mar 2026 09:14:12 +0000</pubDate>
		<dc:creator><![CDATA[Editor]]></dc:creator>
				<category><![CDATA[All News]]></category>
		<category><![CDATA[Events]]></category>
		<category><![CDATA[Robot programming]]></category>
		<category><![CDATA[Robotiq]]></category>
		<category><![CDATA[Sensors]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[physical]]></category>
		<category><![CDATA[Universal Robots]]></category>
		<category><![CDATA[webinar]]></category>

		<guid isPermaLink="false">https://www.roboticsupdate.com/?p=10568</guid>
		<description><![CDATA[Join Robotiq and Universal Robots to explore insights from recent AI conferences and the launch of new products from the companies, in a webinar on 25 March. Recent industry conferences and technology showcases have highlighted major advancements shaping the future of AI and robotics. This year, Physical AI is taking centre stage – with rapid [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="https://www.roboticsupdate.com/wp-content/uploads/2026/03/Robotiq.jpg"><img class="alignright size-medium wp-image-10569" src="https://www.roboticsupdate.com/wp-content/uploads/2026/03/Robotiq-300x179.jpg" alt="Robotiq" width="300" height="179" /></a>Join Robotiq and Universal Robots to explore insights from recent AI conferences and the launch of new products from the companies, <a title="Physical AI in robotics webinar" href="https://www.universal-robots.com/learn-and-connect/events/online-events/physical-ai-key-breakthroughs-shaping-the-future-of-robotics/" target="_blank">in a webinar on 25 March</a>.</p>
<p>Recent industry conferences and technology showcases have highlighted major advancements shaping the future of AI and robotics. This year, Physical AI is taking centre stage – with rapid advances in foundation models, simulation-to-real workflows, and increasingly capable robotic systems.</p>
<p>In this webinar, experts from Robotiq and Universal Robots will share key insights from these events and highlight the Physical AI applications and trends that robotics teams should be watching closely.</p>
<p>The webinar will break down the top five Physical AI breakthroughs demonstrated across recent industry gatherings, discuss where AI still faces challenges when interacting with the physical world, and explore why richer multimodal sensing, especially touch, is becoming critical for training and deploying intelligent robots.</p>
<p>During the session, Robotiq will also introduce its new TSF-85 tactile sensor fingertips, designed to bring pressure, vibration, and proprioception sensing directly to robot grippers. You’ll see how tactile sensing can help generate richer datasets, improve grasp reliability, and support the next generation of Physical AI systems.</p>
<p><a title="Physical AI webinar" href="https://www.universal-robots.com/learn-and-connect/events/online-events/physical-ai-key-breakthroughs-shaping-the-future-of-robotics/" target="_blank">Click here to register for the webinar</a></p>
<p>Visit the Robotiq website for more information</p>
<p>See all stories for Robotiq</p>
]]></content:encoded>
			<wfw:commentRss>https://www.roboticsupdate.com/2026/03/webinar-explores-physical-ai-in-robotics/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>RARUK Automation to offer Acteris AI robot software</title>
		<link>https://www.roboticsupdate.com/2026/03/raruk-automation-to-offer-acteris-ai-robot-software/</link>
		<comments>https://www.roboticsupdate.com/2026/03/raruk-automation-to-offer-acteris-ai-robot-software/#comments</comments>
		<pubDate>Tue, 24 Mar 2026 08:51:06 +0000</pubDate>
		<dc:creator><![CDATA[Editor]]></dc:creator>
				<category><![CDATA[All News]]></category>
		<category><![CDATA[RARUK Automation]]></category>
		<category><![CDATA[Robot programming]]></category>
		<category><![CDATA[Acteris]]></category>
		<category><![CDATA[AI platform]]></category>
		<category><![CDATA[T-Robotics]]></category>
		<category><![CDATA[Trener Robotics]]></category>

		<guid isPermaLink="false">https://www.roboticsupdate.com/?p=10565</guid>
		<description><![CDATA[Automation and robotics distributor RARUK Automation has signed an exclusive UK distribution agreement with Trener Robotics (formerly T-Robotics). Headquartered in San Jose, California, and Trondheim, Norway, Trener Robotics offers an AI platform &#8211; Acteris &#8211; designed to bring artificial intelligence to manufacturing robots. Instead of the traditional code-based programming required to get a robot cell [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="https://www.roboticsupdate.com/wp-content/uploads/2026/03/260324_RAR.jpg"><img class="alignright size-medium wp-image-10566" src="https://www.roboticsupdate.com/wp-content/uploads/2026/03/260324_RAR-300x225.jpg" alt="260324_RAR" width="300" height="225" /></a>Automation and robotics distributor <a title="RARUK Automation" href="https://www.rarukautomation.com/" target="_blank">RARUK Automation</a> has signed an exclusive UK distribution agreement with Trener Robotics (formerly T-Robotics). Headquartered in San Jose, California, and Trondheim, Norway, Trener Robotics offers an AI platform &#8211; Acteris &#8211; designed to bring artificial intelligence to manufacturing robots.</p>
<p>Instead of the traditional code-based programming required to get a robot cell up and running, Trener Robotics’ Acteris platform offers a learning-based software solution powered by Vision-Language-Action (VLA) models. This enables greater adaptability to changing production needs and makes automation increasingly accessible for those with no prior experience. Acteris can be deployed on new robot cells or retrofitted on existing work cells.</p>
<p>RARUK Automation is the UK’s leading Universal Robots distributor, offering a portfolio of complementary products including application kits, end-of-arm tooling (EOAT) and vision systems to automate repetitive manufacturing tasks. Combining Acteris with the popular Universal Robots range will unlock new automation possibilities for RARUK Automation’s UK customers.</p>
<p>“We’re pleased to announce our new partnership with Trener Robotics. Their innovative approach to Physical AI, and replacing point-to-point programming with adaptive intelligence, aligns perfectly with RARUK Automation&#8217;s mission to make automation accessible, intuitive and commercially beneficial for manufacturers of all sizes,” said Ross Lacy, Sales Director at RARUK Automation.</p>
<p>Acteris enables robots to adapt to real-world environments where variation and complex operating conditions are common. This allows manufacturers to move from traditional programming to intuitive and scalable automation.</p>
<p>“Partnering with RARUK Automation is a strategic step in our mission to bring Physical AI to global manufacturing. RARUK has a strong reputation for making automation accessible, and by combining their automation expertise with the Acteris platform, we are giving UK manufacturers the ability to deploy robots that see, learn, and adapt in real time. We are moving away from the era of static programming and into a future where intelligent robotics is the standard on the factory floor,” said Jacob Pascual Pape, VP of Sales at Trener Robotics.</p>
<p>Visit the RARUK Automation website for more information</p>
<p>See all stories for RARUK Automation</p>
]]></content:encoded>
			<wfw:commentRss>https://www.roboticsupdate.com/2026/03/raruk-automation-to-offer-acteris-ai-robot-software/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>UR and Scale AI launch imitation learning system</title>
		<link>https://www.roboticsupdate.com/2026/03/ur-and-scale-ai-launch-imitation-learning-system/</link>
		<comments>https://www.roboticsupdate.com/2026/03/ur-and-scale-ai-launch-imitation-learning-system/#comments</comments>
		<pubDate>Mon, 23 Mar 2026 10:23:03 +0000</pubDate>
		<dc:creator><![CDATA[Editor]]></dc:creator>
				<category><![CDATA[All News]]></category>
		<category><![CDATA[Collaborative robots]]></category>
		<category><![CDATA[Robot programming]]></category>
		<category><![CDATA[Universal Robots]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[learning system]]></category>
		<category><![CDATA[Scale AI]]></category>
		<category><![CDATA[trainer]]></category>

		<guid isPermaLink="false">https://www.roboticsupdate.com/?p=10558</guid>
		<description><![CDATA[Universal Robots (UR) unveiled the UR AI Trainer at GTC 2026 in Silicon Valley. Developed in collaboration with Scale AI, the AI Trainer marks a tectonic shift as robots move from pre-programmed applications to fully AI-driven tasks. These systems are powered by robust data generated in AI training cells where robots imitate humans. “Our customers, ranging from [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="https://www.roboticsupdate.com/wp-content/uploads/2026/03/260323_UR.jpg"><img class="alignright size-medium wp-image-10559" src="https://www.roboticsupdate.com/wp-content/uploads/2026/03/260323_UR-300x225.jpg" alt="260323_UR" width="300" height="225" /></a><a title="Universal Robots" href="https://www.universal-robots.com" target="_blank">Universal Robots</a> (UR) unveiled the UR AI Trainer at GTC 2026 in Silicon Valley. Developed in collaboration with Scale AI, the AI Trainer marks a tectonic shift as robots move from pre-programmed applications to fully AI-driven tasks. These systems are powered by robust data generated in AI training cells where robots imitate humans.</p>
<p>“Our customers, ranging from large enterprises to AI research labs, are no longer just asking for AI features,” said Anders Beck, VP of AI Robotics Products at Universal Robots. “They need a way to collect high-fidelity, synchronised robot and vision data to train AI models on the same robots they intend to deploy. Our AI Trainer is the industry’s first direct lab-to-factory solution for AI model training.”</p>
<p>Alongside the new AI Trainer, Universal Robots’ GTC booth showcased a state-of-the-art robotic foundation model from Generalist AI, a UR preferred model partner. Leveraging this model, two UR robots completed a complex smartphone packaging task, previously impossible without recent advances in the field of Physical AI.</p>
<h4>Enabling AI-ready data capture</h4>
<p>AI robotics training is often hindered by fragmented hardware and low-fidelity data capture. Much of today’s training data is collected on research robots not suited for production environments, and many systems rely only on visual feedback, making delicate or contact-rich tasks difficult.</p>
<p>“The AI Trainer directly addresses these barriers,” said Beck. “By utilising our unique Direct Torque Control and force feedback features, we give developers direct influence over how the robot physically interacts with the world, training on the same robust hardware used in over 100,000 industrial deployments.”</p>
<h4>A flywheel of integrated robotics data</h4>
<p>The AI Trainer allows human operators to guide UR robots through tasks in a leader-follower setup while automatically capturing high-quality multimodal data for robotics AI development. Operators physically guide a “leader” robot through a task while a synchronised “follower” robot mirrors the motion in real time. During each demonstration, the system records synchronised motion, force and visual data, producing the structured datasets required to train Vision-Language-Action (VLA) models.</p>
<p>Deployed on UR’s AI Accelerator platform, the UR AI Trainer combines UR robots with Scale AI software to enable data capture on UR robots in production and at scale, creating a continuous feedback loop that drives ongoing optimisation of physical AI systems.</p>
<p>“Universal Robots is a leader in industrial robotics, and its global footprint offers the ideal foundation for data capture and AI foundation model deployment at scale,” said Ben Levin, General Manager, Physical AI at Scale AI.</p>
<p>“Together, we’ve created an integrated robotics data flywheel, allowing customers to train, deploy, and improve their AI models faster than ever before.” As part of this collaboration, UR and Scale AI will release a large-scale industrial task dataset collected on UR robots later this year.</p>
<h4>First-hand encounters with AI Trainer at GTC</h4>
<p>With GTC as the official launch pad, attendees were able to experience the system first-hand at UR’s booth as they guided two UR3e ‘leader’ robots, providing haptic input to control two UR7e ‘follower’ robots. The setup enabled visitors to perform an advanced smartphone packaging task with haptic feedback for imitation learning and VLA training, with demonstration data recorded in real time on Scale’s stack and replayable directly on the AI Trainer.</p>
<p>The process of capturing robot training data for AI models was further showcased through a demo illustrating the same smartphone packaging task, this time trained virtually. Built in NVIDIA Omniverse and leveraging Isaac Sim, the simulated setup allowed attendees to control a virtual bi-manual UR3e system with real-time haptic feedback using two Haply Inverse3 devices as ‘leaders’, providing a physics-accurate simulation.</p>
<p>Universal Robots is also exploring the use of the NVIDIA Physical AI Data Factory Blueprint to automate and scale its synthetic data generation, transforming world-scale compute into a production engine for high-quality robotic training data.</p>
<p>“The shift toward Physical AI requires a fundamental move from rigid, pre-programmed automation to generalist robots that can perceive, reason and learn through human-like interaction,” said Amit Goel, head of robotics and edge AI ecosystem at NVIDIA.</p>
<p>“By leveraging the NVIDIA Isaac simulation frameworks, Universal Robots is building a scalable engine for high-fidelity data capture and generation, providing the essential infrastructure to train the next generation of autonomous systems at scale.”</p>
<h4>Real-world robotic foundation model performance</h4>
<p>Complementing the two data-capture demonstrations, Generalist’s showcase highlighted how advances in data collection and AI models translate into real-world robotic performance. In the first public demonstration of Generalist’s embodied foundation models, two UR7e robots autonomously executed a complex smartphone packaging task, demonstrating dexterity, coordination and contact-rich manipulation in a real-world environment. The demonstration showed how scaled, high-quality training data combined with frontier model architectures can enable robust physical AI systems beyond the lab.</p>
<p>“Generalist is building embodied foundation models that deliver industry-leading dexterity and reliability,” said Pete Florence, co-founder and CEO of Generalist AI. “This demonstration on Universal Robots’ trusted industrial platform shows how physical common sense can be translated into real-world capability, paving the way for deployment across industries at scale.”</p>
<p>“The adoption of our technology by the pioneers of AI model training and data capture underscores why Universal Robots has become the preferred platform for physical AI,” said Beck.</p>
<p>Visit the Universal Robots website for more information</p>
<p>See all stories for Universal Robots</p>
]]></content:encoded>
			<wfw:commentRss>https://www.roboticsupdate.com/2026/03/ur-and-scale-ai-launch-imitation-learning-system/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>KUKA to showcase latest robots and software at MACH</title>
		<link>https://www.roboticsupdate.com/2026/03/kuka-to-showcase-latest-robots-and-software-at-mach/</link>
		<comments>https://www.roboticsupdate.com/2026/03/kuka-to-showcase-latest-robots-and-software-at-mach/#comments</comments>
		<pubDate>Thu, 12 Mar 2026 09:27:45 +0000</pubDate>
		<dc:creator><![CDATA[Editor]]></dc:creator>
				<category><![CDATA[All News]]></category>
		<category><![CDATA[Articulated Arm]]></category>
		<category><![CDATA[Events]]></category>
		<category><![CDATA[KUKA]]></category>
		<category><![CDATA[Robot programming]]></category>
		<category><![CDATA[customer portal]]></category>
		<category><![CDATA[iiQKA.OS2]]></category>
		<category><![CDATA[industrial robot]]></category>
		<category><![CDATA[Kuka]]></category>
		<category><![CDATA[MACH]]></category>
		<category><![CDATA[my.KUKA]]></category>
		<category><![CDATA[operating system]]></category>
		<category><![CDATA[robot]]></category>

		<guid isPermaLink="false">https://www.roboticsupdate.com/?p=10532</guid>
		<description><![CDATA[KUKA Robotics is set to make a major impact at MACH 2026, the UK&#8217;s largest event for the manufacturing industry, with its cutting-edge robotics and automation solutions. KUKA is inviting industry professionals to experience the very best in robotic automation, software innovation and customer service by visiting stands 18-530 and 17-370 at this year’s MACH [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="https://www.roboticsupdate.com/wp-content/uploads/2026/03/260312_Kuka.jpg"><img class="alignright size-medium wp-image-10533" src="https://www.roboticsupdate.com/wp-content/uploads/2026/03/260312_Kuka-300x225.jpg" alt="260312_Kuka" width="300" height="225" /></a><a title="KUKA" href="https://www.kuka.com" target="_blank">KUKA Robotics</a> is set to make a major impact at MACH 2026, the UK&#8217;s largest event for the manufacturing industry, with its cutting-edge robotics and automation solutions.</p>
<p>KUKA is inviting industry professionals to experience the very best in robotic automation, software innovation and customer service by visiting stands 18-530 and 17-370 at this year’s MACH event.</p>
<p>With over 125 years of engineering experience and expertise, KUKA continues to redefine industrial automation, offering manufacturers a wide range of cost-effective solutions. Visitors to the show will have the opportunity to explore the company’s full portfolio of high-performance industrial robots, and an extensive suite of software and digital tools that support seamless automation integration.</p>
<p>KUKA’s presence at MACH 2026 will focus on how its advanced technologies help manufacturers across all sectors to boost efficiency, enhance product quality, reduce operational costs, and ultimately increase profitability &#8211; making automation accessible, scalable, and future-proof.</p>
<h4>A complete automation ecosystem</h4>
<p>At the heart of KUKA’s stand (18-530) will be a display of its portfolio of latest-generation industrial robots, showcasing the precision, speed, and adaptability that modern production lines demand. From compact six-axis models ideal for intricate assembly tasks, to heavy-duty robots capable of handling large payloads, KUKA’s hardware reflects the company&#8217;s commitment to quality and versatility.</p>
<p>Complementing its robot range, KUKA will also showcase its new iiQKA.OS2 robot operating system, my.KUKA online customer portal, and demonstrate the iiQoT remote condition monitoring and iiQWorks engineering software platforms, among others. Available via the KUKA digital sphere, these software tools allow for intuitive programming, simulation, and integration, reducing commissioning time and providing real-time performance insights that drive smarter decision-making on the factory floor.</p>
<p>The KUKA stand will also include a live demonstration of its Advanced Welding capabilities in the shape of a desktop friction welding machine built especially for the show.</p>
<p>At MACH 2026, Visual Components will co-exhibit with KUKA, showing how manufacturing simulation and offline robot programming help bring automation projects from concept to reality.</p>
<p>Visual Components will present its simulation and robot offline programming solutions, enabling manufacturers to explore production concepts, test layouts, and evaluate different scenarios in a virtual environment before anything is built. By programming robots directly from the virtual model, teams can reduce errors, improve planning accuracy, and shorten commissioning and ramp-up time.</p>
<h4>Expert educational insight</h4>
<p>In addition to the exhibits on Stand 18-530, KUKA will also contribute to the show’s educational programme, providing a hands-on experience on Stand 17-370 in the Education and Development Zone. Here, visitors and students will be able to try the latest KUKA.AMR Fleet software, designed to program and control autonomous mobile robots (AMRs), and learn how they are transforming intralogistics by enabling flexible, intelligent material transport within complex production environments.</p>
<p>KUKA will also provide a live robot demonstration in the Automation &amp; Robotics Knowledge Hub during the show where attendees will gain insight into how advanced robotics can improve consistency, quality, and throughput in their manufacturing processes, while reducing waste and manual labour challenges.</p>
<p>These sessions will be ideal for manufacturing engineers, production managers, and decision-makers looking to stay ahead of the curve in their automation strategies.</p>
<p>Visit the KUKA website for more information</p>
<p>See all stories for KUKA</p>
]]></content:encoded>
			<wfw:commentRss>https://www.roboticsupdate.com/2026/03/kuka-to-showcase-latest-robots-and-software-at-mach/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>ABB delivers industrial-grade physical AI at scale</title>
		<link>https://www.roboticsupdate.com/2026/03/abb-delivers-industrial-grade-physical-ai-at-scale/</link>
		<comments>https://www.roboticsupdate.com/2026/03/abb-delivers-industrial-grade-physical-ai-at-scale/#comments</comments>
		<pubDate>Tue, 10 Mar 2026 08:37:11 +0000</pubDate>
		<dc:creator><![CDATA[Editor]]></dc:creator>
				<category><![CDATA[ABB Robotics]]></category>
		<category><![CDATA[All News]]></category>
		<category><![CDATA[Robot programming]]></category>
		<category><![CDATA[ABB]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[NVIDIA]]></category>
		<category><![CDATA[Omniverse]]></category>
		<category><![CDATA[robotics]]></category>
		<category><![CDATA[RobotStudio]]></category>
		<category><![CDATA[simulation]]></category>

		<guid isPermaLink="false">https://www.roboticsupdate.com/?p=10525</guid>
		<description><![CDATA[ABB Robotics is integrating NVIDIA Omniverse libraries into ABB Robotics’ RobotStudio to help manufacturers deploy physical AI in real world robotics applications. “Today, using NVIDIA accelerated computing and simulation technologies, we have removed the last barriers to making industrial and physical AI a reality at a global scale by closing the sim-to-real gap,” said Marc Segura, President of ABB Robotics. [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="https://www.roboticsupdate.com/wp-content/uploads/2026/03/260310_ABB.jpg"><img class="alignright size-medium wp-image-10526" src="https://www.roboticsupdate.com/wp-content/uploads/2026/03/260310_ABB-300x225.jpg" alt="260310_ABB" width="300" height="225" /></a><a title="ABB Robotics" href="https://go.abb/robotics" target="_blank">ABB Robotics</a> is integrating NVIDIA Omniverse libraries into ABB Robotics’ RobotStudio to help manufacturers deploy physical AI in real world robotics applications.</p>
<p>“Today, using NVIDIA accelerated computing and simulation technologies, we have removed the last barriers to making industrial and physical AI a reality at a global scale by closing the sim-to-real gap,” said Marc Segura, President of ABB Robotics.</p>
<p>“For more than 50 years, ABB Robotics has led the evolution of intelligent industrial automation, from pioneering the first generation of fully electric industrial robots to advancing digital twin simulation through RobotStudio and shaping a new era of autonomous and versatile mobile robots. Today’s announcement with NVIDIA brings physical AI to industry at scale.”</p>
<p>The collaboration focuses on combining ABB Robotics’ software programming, design and simulation suite, RobotStudio, with the physically accurate simulation power of NVIDIA Omniverse libraries to close the long-standing ‘sim-to-real’ gap. Developers can simulate robots in digital twins and generate synthetic data to train their physical AI models, enabling businesses of all types and sizes to deploy AI-driven robotics for various industrial workflows.</p>
<p>Called RobotStudio HyperReality, the resulting physically accurate simulations and foundation models are continuously optimised with real-world data feedback, improving the system over time. These models can be used to train any number of ABB robots, anywhere in the world, with the reliability and accuracy demanded by industry.</p>
<p>“The industrial sector needs physically accurate simulation to bridge the gap between virtual training and the real-world deployment of AI-driven robotics at scale,” said Deepu Talla, vice president of robotics and edge AI at NVIDIA. “Integrating NVIDIA Omniverse libraries into RobotStudio brings advanced simulation and accelerated computing to ABB Robotics’ unique virtual controller technology, accelerating how manufacturers of all sizes bring complex products to market.”</p>
<h4>Closing the ‘sim-to-real’ gap</h4>
<p>The long-standing discrepancy between simulated and real-world lighting, materials and environments is known as the ‘sim-to-real’ gap. For decades, this gap has limited the ability of manufacturers to design and develop advanced manufacturing processes in the virtual world.</p>
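<p>As a general illustration of how synthetic data can help narrow this gap (this is a generic technique known as domain randomisation, not ABB’s actual pipeline; all parameter names below are illustrative), training scenes are generated with deliberately varied lighting, materials and object poses, so that a model trained on them treats the real cell as just one more sample from the simulator’s distribution:</p>

```python
import random

def randomised_scene(rng: random.Random) -> dict:
    """Sample one synthetic training scene with randomised lighting,
    material and object-pose parameters (illustrative ranges only)."""
    return {
        "light_intensity": rng.uniform(0.2, 1.0),   # relative brightness
        "surface_gloss":   rng.uniform(0.0, 1.0),   # matte .. mirror
        "object_xy_mm":    (rng.uniform(-5.0, 5.0), rng.uniform(-5.0, 5.0)),
        "camera_noise":    rng.gauss(0.0, 0.01),    # sensor noise level
    }

def generate_dataset(n: int, seed: int = 0) -> list[dict]:
    """Generate a reproducible batch of randomised scenes for training."""
    rng = random.Random(seed)
    return [randomised_scene(rng) for _ in range(n)]

scenes = generate_dataset(1000)
```

<p>A vision or manipulation model trained across many such variations is less likely to overfit to one rendering of the world, which is the core idea behind using simulation to train policies that survive contact with a real production line.</p>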
<p>By integrating NVIDIA Omniverse libraries into RobotStudio, ABB Robotics will deliver unprecedented robotics simulation and synthetic data generation capabilities that will allow intelligent robots to bridge this gap with up to 99 percent accuracy. ABB is the only robot manufacturer with a virtual controller running the same firmware as the hardware, ensuring near-perfect correlation between simulation and real-world performance. Combined with ABB Robotics’ Absolute Accuracy technology, which reduces positioning errors from 8-15 mm to around 0.5 mm, ABB delivers unmatched precision in both virtual and physical environments, making it well suited to high-precision industrial-grade applications.</p>
<p>This innovation enables manufacturers to design, test, and optimise production lines virtually, cutting setup and commissioning times by up to 80 percent, reducing costs by up to 40 percent by eliminating the need for physical prototypes, and accelerating time-to-market for complex products such as consumer electronics by 50 percent.</p>
<p>ABB Robotics is also assessing the potential to integrate the NVIDIA Jetson edge computing platform into its Omnicore controller to achieve real-time AI inference at the edge for its extensive robot portfolio. The announcement builds upon ABB Robotics’ long-standing work with NVIDIA, including the previous integration of NVIDIA Jetson into ABB Robotics’ VSLAM autonomous mobile robots as well as the development of gigawatt-scale AI data centres.</p>
<h4>Real-world applications, today</h4>
<p>RobotStudio HyperReality will serve industrial clients at any scale, across a breadth of industries and applications, with select customers already testing its capabilities ahead of a full release to ABB Robotics’ 60,000 RobotStudio customers worldwide in the second half of 2026.</p>
<p>Foxconn, the world’s largest electronics contract manufacturer, is piloting the first joint use case in consumer electronics assembly. Automating the assembly of tiny components in consumer electronics is challenging: multiple device variants require different production methods, and the delicate metal structures demand precise pick-and-place and assembly control as well as fine-tuned setup, often requiring additional debugging time and engineering resources. Using RobotStudio HyperReality, Foxconn’s assembly robots are trained virtually, using synthetic data to perfect multiple real-world production processes in various scenarios, before moving them to the production line with 99 percent accuracy. By optimising production lines virtually, Foxconn will reduce set-up times and costs by eliminating physical training and tests, and accelerate time-to-market for consumer electronics.</p>
<p>“Precision is everything in consumer electronics manufacturing and until now, this level of accuracy and fidelity just wasn’t possible in simulation and digital twins,” said Dr. Zhe Shi, Chief Digital Officer of Foxconn. “We’re incredibly excited by the potential of ABB Robotics and NVIDIA’s collaboration, which enables parallel engineering for better designs, faster production ramp‑up and greater product evolution through advanced AI inference and understanding.”</p>
<p>WORKR, a California-based robotic workforce company that delivers robotic manufacturing solutions to industry, is extending the reach of this technology to small and medium manufacturers across the United States. At NVIDIA GTC 2026, WORKR will demonstrate AI-powered robotic systems built on ABB technology, trained with synthetic data using NVIDIA Omniverse libraries, and deployed without operators needing to know any programming. By combining ABB&#8217;s industrial-grade robotics with its proprietary WorkrCore AI platform, the company is helping manufacturers address critical labour shortages with a robotic workforce that can learn new tasks in minutes and be operated by anyone.</p>
<p>&#8220;This collaboration is about making industrial AI deployable today,” said Ken Macken, CEO and Founder of WORKR. “Together with ABB and NVIDIA, we&#8217;re proving that advanced automation can work for manufacturers of any size.&#8221;</p>
<p>Visit the ABB Robotics website for more information</p>
<p>See all stories for ABB Robotics</p>
]]></content:encoded>
			<wfw:commentRss>https://www.roboticsupdate.com/2026/03/abb-delivers-industrial-grade-physical-ai-at-scale/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Integrating advanced perception and AI</title>
		<link>https://www.roboticsupdate.com/2026/03/integrating-advanced-perception-and-ai/</link>
		<comments>https://www.roboticsupdate.com/2026/03/integrating-advanced-perception-and-ai/#comments</comments>
		<pubDate>Thu, 05 Mar 2026 07:28:31 +0000</pubDate>
		<dc:creator><![CDATA[Editor]]></dc:creator>
				<category><![CDATA[All News]]></category>
		<category><![CDATA[Robot programming]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[Cognibotics]]></category>
		<category><![CDATA[robot calibration]]></category>

		<guid isPermaLink="false">https://www.roboticsupdate.com/?p=10500</guid>
		<description><![CDATA[Cognibotics has announced that a leading global industrial robot manufacturer has placed an order valued at approximately SEK 16 million for its CogniCal robot calibration technology. The order expands the customer&#8217;s use of CogniCal in robot production, and reinforces the growing role of calibration in industrial robot manufacturing. CogniCal&#8217;s high-accuracy robot performance compensates for geometric [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="https://www.roboticsupdate.com/wp-content/uploads/2026/03/260305_Cognibotics.jpg"><img class="alignright size-medium wp-image-10502" src="https://www.roboticsupdate.com/wp-content/uploads/2026/03/260305_Cognibotics-300x239.jpg" alt="260305_Cognibotics" width="300" height="239" /></a><a title="Cognibotics" href="https://www.cognibotics.com" target="_blank">Cognibotics</a> has announced that a leading global industrial robot manufacturer has placed an order valued at approximately SEK 16 million for its CogniCal robot calibration technology. The order expands the customer&#8217;s use of CogniCal in robot production, and reinforces the growing role of calibration in industrial robot manufacturing.</p>
<p>CogniCal delivers high-accuracy robot performance by compensating for geometric deviations, elastic effects, and payload influence. The result is improved accuracy, reduced variance between robots, and shorter start-up times for end customers.</p>
<p>This order reflects rising market demand for motion and calibration software that strengthens both traditional robot performance and future perception- and AI-driven robotic applications.</p>
<h4>Precision as infrastructure for scalable automation</h4>
<p>Robot accuracy directly impacts integration cost, system uptime, and AI training robustness. Without calibration, geometric differences between individual robots introduce inconsistencies that require costly per-robot adjustments. In AI-driven systems, the same deviations add noise that reduces the transferability of trained skills across a fleet. CogniCal minimises these variations, enabling:</p>
<ul>
<li>Faster commissioning and deployment</li>
<li>Reduced engineering effort and rework</li>
<li>Improved robot-to-robot consistency</li>
<li>Higher reliability across production sites</li>
<li>More transferable AI-trained skills across robot fleets</li>
</ul>
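<p>As a general illustration of the idea behind such calibration (not CogniCal’s actual error model, which also covers elastic and payload effects), the simplest geometric error model fits a constant Cartesian offset from paired commanded and measured positions, then compensates future commands with it:</p>

```python
import numpy as np

def fit_offset(commanded: np.ndarray, measured: np.ndarray) -> np.ndarray:
    """Fit a constant Cartesian offset (the simplest geometric error
    model) by least squares over paired commanded/measured positions.
    For a pure translation model the least-squares solution is the
    mean of the residuals."""
    return np.mean(measured - commanded, axis=0)

def compensate(target: np.ndarray, offset: np.ndarray) -> np.ndarray:
    """Shift the commanded target so the measured result lands on it."""
    return target - offset

# Simulated identification run: this robot systematically lands
# 0.8 mm off along x (values in mm, purely illustrative).
commanded = np.array([[100.0, 0.0, 50.0],
                      [200.0, 10.0, 50.0],
                      [150.0, -5.0, 60.0]])
measured = commanded + np.array([0.8, 0.0, 0.0])

offset = fit_offset(commanded, measured)           # -> [0.8, 0.0, 0.0]
corrected = compensate(np.array([120.0, 0.0, 55.0]), offset)
```

<p>Real calibration products identify a far richer parameter set (joint offsets, link lengths, elasticity, payload deflection) per individual robot, which is what removes the robot-to-robot variation described above.</p>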
<p>&#8220;Accuracy is foundational to scalable automation. As robotics moves toward perception and AI-driven workflows, consistency between robot individuals becomes critical. This order confirms that calibration is no longer optional infrastructure – it is strategic,&#8221; said Fredrik Malmgren, CEO of Cognibotics.</p>
<h4>Strengthening Cognibotics&#8217; software footprint</h4>
<p>The deployment reinforces Cognibotics&#8217; position as a provider of high-value motion software to the robotics industry. CogniCal is a foundational layer in Cognibotics&#8217; tech stack, supporting more efficient ways to build and scale robot applications.</p>
]]></content:encoded>
			<wfw:commentRss>https://www.roboticsupdate.com/2026/03/integrating-advanced-perception-and-ai/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Calibration for next-generation AI cobots</title>
		<link>https://www.roboticsupdate.com/2026/01/calibration-for-next-generation-ai-cobots/</link>
		<comments>https://www.roboticsupdate.com/2026/01/calibration-for-next-generation-ai-cobots/#comments</comments>
		<pubDate>Tue, 13 Jan 2026 08:09:50 +0000</pubDate>
		<dc:creator><![CDATA[Editor]]></dc:creator>
				<category><![CDATA[All News]]></category>
		<category><![CDATA[Case studies]]></category>
		<category><![CDATA[Robot programming]]></category>
		<category><![CDATA[calibration]]></category>
		<category><![CDATA[cobot]]></category>
		<category><![CDATA[Cognibotics]]></category>
		<category><![CDATA[CogniCal]]></category>

		<guid isPermaLink="false">https://www.roboticsupdate.com/?p=10357</guid>
		<description><![CDATA[Standard Bots, an AI-driven robotics company building collaborative robot arms in the United States, has selected Cognibotics’ CogniCal calibration for its next-generation cobot platform. CogniCal delivers world-class accuracy, making robots easier to integrate and commission across a wider range of tasks. With accurate robots, training of AI driven robot tasks becomes more efficient and transferable [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="https://www.roboticsupdate.com/wp-content/uploads/2026/01/260113_Cognibotics.jpg"><img class="alignright size-medium wp-image-10358" src="https://www.roboticsupdate.com/wp-content/uploads/2026/01/260113_Cognibotics-300x168.jpg" alt="260113_Cognibotics" width="300" height="168" /></a>Standard Bots, an AI-driven robotics company building collaborative robot arms in the United States, has selected Cognibotics’ <a title="Cognibotics calibration" href="https://www.cognibotics.com/en/products/mptoolsuite" target="_blank">CogniCal calibration</a> for its next-generation cobot platform.</p>
<p>CogniCal delivers world-class accuracy, making robots easier to integrate and commission across a wider range of tasks. With accurate robots, training of AI-driven robot tasks becomes more efficient and transferable between individual robots – critical for scalable intelligence.</p>
<p>Standard Bots is known for its RO1 cobot and a new 30 kg, 2 m reach model. Both combine 6-axis arms with integrated AI, 3D vision, and no-code programming.</p>
<p>“Cognibotics delivered the most robust and production-ready calibration technology we tested, outperforming both open-source and commercial alternatives. Their solution becomes a key enabler in how we scale Standard Bots,” said Evan Beard, CEO of Standard Bots.</p>
<h4>Precision that scales across robots and sites</h4>
<p>Robot accuracy makes application building easier and the integration of, for example, cameras more valuable. CogniCal limits the differences between individual robots caused by geometry deviations and payload influence – phenomena that otherwise introduce noise into AI training data or, in the classical case, require costly adaptations of the robot programs.</p>
<p>“AI is changing how robots are programmed, but factories still rely on precision, uptime, and predictability. That’s where Cognibotics comes in. By combining Standard Bots’ AI-first cobots with our calibration, manufacturers get robots that are as accurate and stable as traditional high-end systems, but deployable at software speed,” said Fredrik Malmgren, CEO, Cognibotics.</p>
<p>For end-users, this means shorter commissioning times, better predictability and AI-trained skills that are easily transferred between robots and sites.</p>
]]></content:encoded>
			<wfw:commentRss>https://www.roboticsupdate.com/2026/01/calibration-for-next-generation-ai-cobots/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Learn AI-ready robot programming for free</title>
		<link>https://www.roboticsupdate.com/2026/01/learn-al-ready-robot-programming-for-free/</link>
		<comments>https://www.roboticsupdate.com/2026/01/learn-al-ready-robot-programming-for-free/#comments</comments>
		<pubDate>Mon, 05 Jan 2026 08:24:43 +0000</pubDate>
		<dc:creator><![CDATA[Editor]]></dc:creator>
				<category><![CDATA[All News]]></category>
		<category><![CDATA[Robot programming]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[Cognibotics]]></category>
		<category><![CDATA[course]]></category>
		<category><![CDATA[Juliet]]></category>
		<category><![CDATA[Lund University]]></category>
		<category><![CDATA[robot programming]]></category>

		<guid isPermaLink="false">https://www.roboticsupdate.com/?p=10327</guid>
		<description><![CDATA[Lund University and Cognibotics have launched a free Coursera course in modern motion programming, with an AI module in development, making industrial-grade robot programming accessible to anyone with a laptop. In a world increasingly shaped by automation and robotics, the need for clear, reliable and accessible knowledge in the field is growing. Lund University, together [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="https://www.roboticsupdate.com/wp-content/uploads/2026/01/260105_Cognibotics.jpg"><img class="alignright size-medium wp-image-10328" src="https://www.roboticsupdate.com/wp-content/uploads/2026/01/260105_Cognibotics-300x168.jpg" alt="260105_Cognibotics" width="300" height="168" /></a>Lund University and Cognibotics have launched a <a title="Coursera Juliet robot programming" href="https://www.coursera.org/learn/juliet" target="_blank">free Coursera course</a> in modern motion programming, with an AI module in development, making industrial-grade robot programming accessible to anyone with a laptop.</p>
<p>In a world increasingly shaped by automation and robotics, the need for clear, reliable and accessible knowledge in the field is growing. Lund University, together with Cognibotics, has launched “The Juliet Language for Motion Programming” as a community impact course on Coursera – making it freely available to anyone with a computer and a Coursera account. With this initiative, Lund University and Cognibotics are taking a concrete step toward democratising robotics.</p>
<p>As robotics takes on a larger role in society, the demand for people with in-depth knowledge of motion programming is increasing just as rapidly. Yet for learners such as students, hobbyists, and independent developers, the current robotics ecosystems present three major obstacles. First, many of the robot programming languages used today were built decades ago, which makes it difficult to apply modern software and workflows. Second, each major industry player has its own solution, often non-modular and incompatible with others. And third, many tools are proprietary, meaning that there is usually a paywall to learn and use those languages.</p>
<p>Because of this, many newcomers to robotics turn to modern and free-to-use software instead. However, they often struggle because they lack the proper understanding needed to program motion safely, interactively, and in a way that reflects real industrial requirements.</p>
<p>“Today, working with robots often demands a rare mix of control theory, software engineering, and domain expertise,” said Fredrik Malmgren, CEO, Cognibotics. “In demanding warehouse applications, where Juliet &amp; Romeo is already in use, partners describe it as ‘plug and play’ compared to traditional systems. This course helps close the skills gap by making industrial-grade motion programming accessible to many more students and engineers – it’s a concrete step toward democratising robotics.”</p>
<p>Juliet is a brand-new robot programming language developed by Cognibotics together with Estun Automation. Inspired syntactically by its sister language Julia, Juliet combines the expressive power of modern programming languages with decades of proven industrial expertise in safe and interactive motion control. Behind Juliet stands Romeo, the real-time runtime that executes Juliet code and fulfils industry requirements on robot programming in terms of robustness, predictability, and user interaction. Together, Juliet &amp; Romeo provide a proper foundation for teaching and practising motion programming in a way that aligns with both industrial demands and contemporary software workflows.</p>
<p>The Juliet Language for Motion Programming provides insight into robot programming from a theoretical perspective and explains how important aspects of programming robots are reflected in Juliet &amp; Romeo. The teachers of the course, Amina Gojak, Philip Olhager, Klas Nilsson and Sandra Collin, bring a unique combination of expertise: technical knowledge of Juliet &amp; Romeo, extensive software experience in Julia and motion programming, hands-on experience with world-leading robot programming languages, and decades of academic insight into what it truly takes to program robot motion in an effective, safe and optimal way.</p>
<p>&#8220;For the university, it&#8217;s important that students don&#8217;t just learn robotics in theory, but also see how modern tools are used in real industrial systems,&#8221; said Klas Nilsson, CTO, Cognibotics, and Senior Lecturer in Robotics and Semantic Systems, Lund University. &#8220;This course lets learners anywhere in the world explore motion programming through a language and runtime that reflect today&#8217;s robotics challenges.&#8221;</p>
<p>As a next step, the team behind The Juliet Language for Motion Programming is extending the course with an additional AI-focused module, which will teach how to use Juliet &amp; Romeo for Physical AI applications.</p>
<p>The course is freely available and open to anyone interested in understanding how modern robot programming works – whether they are just beginning their journey in robotics or seeking to deepen their expertise.</p>
]]></content:encoded>
			<wfw:commentRss>https://www.roboticsupdate.com/2026/01/learn-al-ready-robot-programming-for-free/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
	</channel>
</rss>
