Source: Xinhua
Editor: huaxia
2026-04-18 14:11:45
by Yi Ling
BEIJING, April 18 (Xinhua) -- No celebrity. No influencer. No scripted drama. The unlikely star of a recent marathon livestream was a robot on a factory assembly line -- and it kept viewers glued for eight straight hours.
On Tuesday, AgiBot, a Shanghai-based embodied AI company, broadcast live from the intelligent manufacturing center of Longcheer, a leading smart device designer and manufacturer, in Nanchang, capital of east China's Jiangxi Province.
The camera followed a single robot, named AgiBot G2, as it worked a real production shift: picking tablets off a moving conveyor, inserting them into test fixtures, sorting good and bad units, and handing off finished products. No cuts, no replays. Just a machine doing its job.
And people watched. Tens of thousands tuned in, many staying far longer than typical livestream audiences. Comments flooded in: "It's hypnotic," "This is better than a game," "Finally, proof -- not a demo."
The fascination was not just with the robot itself. The livestream marked the world's first large-scale industrial deployment of embodied AI systems within core production workflows in consumer electronics manufacturing.
Smart factories and industrial robots are hardly new in China, which has deployed them for years. The real question has always been: can they step out of the lab and actually work -- reliably, day and night, on a real production line? Skepticism has lingered. Livestreaming the answer, raw and unedited, took unusual confidence. The companies behind the robot said that was exactly the point.
"We wanted people to see it with their own eyes," said Li Long, general manager of the robotics division at Longcheer, which hosted the test.
"We chose material handling as the entry point for embodied intelligence deployment because it offers the best scenario to test the boundaries of robot capabilities," said Li. "This was a real line, real pace, real pressure."
The result: the AgiBot G2 ran continuously for over eight hours with no major failure, achieving a task success rate above 99.5 percent. Each operation took 18 to 20 seconds, and the robot processed 310 units per hour -- matching two human stations.
It adapted to unexpected conveyor movements, handled position deviations on the line, and even flagged defective products. The machine also communicated in real time with the factory's manufacturing execution system (MES), the software that tracks and controls production. After the livestream, the robot had logged more than 140 cumulative stable hours on the line.
"We have entered a completely new phase. Embodied intelligence is no longer a laboratory concept. It has become real industrial productivity. The joint deployment by AgiBot and Longcheer proves this conclusively," said Yao Maoqing, president of the embodied business unit at AgiBot.
To understand why this matters, consider the difference between traditional robots and embodied AI.
Conventional industrial arms follow rigid, pre-programmed motions. They are fast but dumb: change the product model, and they often require new tooling, lengthy reprogramming, and days of downtime.
Embodied intelligence, by contrast, integrates AI into a physical system that perceives its surroundings, makes decisions, and adapts. The AgiBot G2, for instance, uses a three-degree-of-freedom waist and a cross-wrist force-control arm with high-precision torque sensors. In plain language, it can "feel" when something is off and correct itself -- no manual adjustment of force-control parameters required.
That flexibility has long been a holy grail for the 3C (computer, communication, consumer electronics) industry, where product life cycles are short, models change fast, and traditional automation is often too rigid and costly.
Li noted that embodied intelligence's learning capability gives it significant advantages in flexible deployment and rapid iteration.
The AgiBot G2 can be retooled for a new tablet model in under four hours of retraining, with 95 percent equipment reusability. Scene calibration takes as little as 15 minutes.
"Humanoid robots entering factories are not simply about replacement; they are about collaboration," Li emphasized. "Between traditional robotic arms and embodied intelligence, the relationship right now is not 'substitution' but more 'synergy.' Combining the two can better empower us and meet customer needs."
In Li's view, the collaboration between Longcheer and AgiBot is not just a technology verification: it provides a replicable, scalable pathway for future smart manufacturing platforms.
The deployment sits on a solid policy foundation. In late 2025, eight of China's central ministerial-level agencies jointly issued an "AI Plus Manufacturing" action plan, aiming to roll out 1,000 high-level industrial intelligence agents and promote 500 typical application scenarios by 2027.
In February, China issued its first national standard system for humanoid robots and embodied AI, covering the entire industrial chain. The country's 15th Five-Year Plan (2026-2030) also explicitly lists embodied intelligence as a core growth engine.
Several AgiBot G2 units are now running steadily at Longcheer's Nanchang plant. The company plans to scale to 100 units by the third quarter of 2026 and expand into automotive, semiconductor, and energy sectors, said Yao.
Regarding future challenges, Yao emphasized the importance of data accumulation.
"Embodied intelligence currently has very little data. We believe we may need on the order of 100 million hours of real-world and simulation data to reach the 'ChatGPT moment,'" he said.
For viewers who watched the eight-hour stream, the message was simple: China's smart manufacturing robots are not just hype. They have clocked in. And they are keeping their jobs. ■