
LinkCraft automatically translates human gestures into precise robotic actions by integrating these technologies. “No programming skills or specialized equipment are needed,” said Zhihui Peng, Co-Founder, President, and CTO of AgiBot. “Simply upload a video of human movements, and LinkCraft handles the rest, transforming real-world actions into precise robot motion effortlessly. With this platform, we’re bringing professional-grade capabilities to everyday users,” he continued.

The need for LinkCraft

For years, designing robot actions has remained a complex process, limiting participation to experts in programming and robotics. LinkCraft changes this paradigm. Users can now record a video on their smartphones and upload it to the platform, where the robot learns and replicates those movements autonomously.

AgiBot plans to enhance imitation learning within LinkCraft to support fine motor control, including individual finger movements, to achieve even greater motion precision and realism.

Enabling voice performance

Beyond motion imitation, LinkCraft adds a Voice Performance feature that enables robots to speak and emote naturally. Users can upload audio, record online, or simply type text, and the system’s multimodal AI synchronizes the robot’s movements, facial expressions, and tone for lifelike delivery.

With built-in voice synthesis, creators can fine-tune factors such as tone, voice type, and emotion to convey personality and mood. The feature lets users create immersive, emotionally engaging performances without requiring technical expertise.

Bringing cinematic control to robotic storytelling

AgiBot’s LinkCraft earlier introduced Timeline Orchestration, a feature developed to enhance performance realism. It lets users choreograph robots with frame-level precision, mixing and sequencing actions, voices, and facial expressions just like video clips to turn simple motions into cohesive narratives. Timeline editing tools also allow precise control over timing, transitions, and storytelling, enabling robots to deliver smooth, cinematic performances that feel human and engaging.

Group editing and multi-robot control

AgiBot’s LinkCraft platform has also introduced Device Linkage and Group Control to support large-scale performances and collaborative interactions, enabling multiple robots to perform in harmony. Ideal for commercial and entertainment settings, the platform also includes a library of 180 action sets and 140 expression templates for effortless content creation and sharing across users.

About real-world deployment

The platform is currently optimized for the AgiBot X2, with future support planned for the A2 and other models. The X2 is in mass production, with shipments expected to reach thousands of units in 2025. Backed by the LinkCraft platform, the AgiBot X2 has already been integrated into entertainment, retail, research, and education environments.