
  • Complex Communication Needs at TED's Live Events Require Seamless Comms

    Tailored to TED’s high-stakes live event environment, the new communications upgrade provides robust, real-time capabilities across both on-site and remote production teams.

    Edited by EE Team | Stage Events | Oct 27, 2025

    TED Conferences LLC recently partnered with Clear-Com® to enhance its live production capabilities through a custom intercom solution centered on the Arcadia® Central Station communications platform. Clear-Com’s Applications team, represented by John Ferrante, led the installation at TED’s New York production facility, configuring an end-to-end system designed to support the complex communication needs of TED conferences. The Arcadia Central Station, upgraded to the latest firmware release, was integrated with an extensive suite of FreeSpeak II® digital wireless products, IP transceivers, wireless beltpacks, and remote connectivity via Clear-Com’s Gen-IC® virtual intercom.

    All photos courtesy of TED Conferences LLC.

    As part of the upgrade, TED also transitioned from its legacy analog wired intercom to the HelixNet® Digital Network Partyline System. This configuration streamlined on-demand communications across multiple teams, making the system adaptable for both live shows and distributed production workflows.

    The heart of TED’s new communications system integrates seamlessly with its existing network, allowing Clear-Com equipment to connect with other audio systems across the venue with minimal latency and pristine audio fidelity. Integrated IP and Dante-enabled channels further expand TED’s ability to manage communications in real time. This setup enables TED to retain established communication flows while introducing the flexibility and scalability of Arcadia as the organization gradually transitions to its new infrastructure.

    Gen-IC adds another layer of remote connectivity to TED’s operations. Six IVC links connect remote team members directly into TED’s on-site Arcadia infrastructure, delivering seamless integration between local and remote participants. This functionality is supported by an LQ® Series IP Interface connected to the Gen-IC Cloud, which provides TED’s production team with dynamic control over both remote and local communications channels. Gen-IC’s ability to easily integrate remote users has been transformative, enabling TED to remain agile while maintaining the highest standards of audio quality.

    Through on-site training, TED’s team can now configure channels, assign users, and manage remote links, providing a future-proofed system that scales with the organization’s growth (a toy sketch of such a channel plan appears below). This workflow reflects the unique challenges TED faces as a hybrid live event and content production organization, blending elements of broadcasting, streaming, and high-end meeting production, demands that resonate with today’s cutting-edge producers of corporate AV events.

    Clear-Com’s installation of Arcadia, along with the migration from analog to HelixNet digital wired comms, deployment of FreeSpeak II beltpacks, and integration of Gen-IC, equips TED with a flexible, reliable communication solution that supports the present and future demands of live event production, remote collaboration, and content distribution. This empowers TED’s talented team to deliver programming with greater efficiency and quality.

    For more information: Clear-Com | TED Conferences | Arcadia Central Station | HelixNet Partyline System | LQ Series
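
    The article does not spell out how channels and users are laid out, but the workflow it describes (local FreeSpeak II beltpacks and HelixNet stations plus remote Gen-IC users sharing named channels) can be pictured with a small data model. The sketch below is a hypothetical illustration only; the class and field names are invented, and it is not Clear-Com software or any Clear-Com API.

    ```python
    # Hypothetical model of an intercom channel plan of the kind described above.
    # Names and fields are invented for illustration; this is not Clear-Com's API.
    from dataclasses import dataclass, field

    @dataclass
    class Endpoint:
        name: str
        kind: str                              # e.g. "FreeSpeak II beltpack", "Gen-IC remote"
        channels: set[str] = field(default_factory=set)

    @dataclass
    class ChannelPlan:
        endpoints: list[Endpoint] = field(default_factory=list)

        def assign(self, endpoint: Endpoint, channel: str) -> None:
            """Give an endpoint talk/listen access to a named channel."""
            if endpoint not in self.endpoints:
                self.endpoints.append(endpoint)
            endpoint.channels.add(channel)

        def members(self, channel: str) -> list[str]:
            """Everyone, local or remote, with access to a given channel."""
            return [e.name for e in self.endpoints if channel in e.channels]

    if __name__ == "__main__":
        plan = ChannelPlan()
        stage_mgr = Endpoint("Stage Manager", "FreeSpeak II beltpack")
        a1 = Endpoint("A1", "HelixNet station")
        remote_dir = Endpoint("Remote Director", "Gen-IC remote (via LQ)")

        for ep in (stage_mgr, a1, remote_dir):
            plan.assign(ep, "Production")          # shared production partyline
        plan.assign(a1, "Audio")                   # local-only audio channel
        plan.assign(remote_dir, "Stream Control")  # remote stream coordination

        print("Production channel members:", plan.members("Production"))
    ```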

  • Making WALL-E Look Battered

    The real-life WALL-E that visits newsrooms and tradeshows and goes on media junkets had to look like the animated WALL-E from the movie. It appears that, like life, there’s a process that has to be gone through in order to look older.

    EE Staff | Film and TV | Jun 4, 2025

    Because the WALL-E from the movie has been around for a long time, his body had been weather-worn and beaten up by the work he does – compacting trash and stacking it neatly. Computer animation allowed animators to create the look and feel of a well-worn WALL-E, but transferring that same look and feel to a ‘real’ robot was another story.

    Pixar utilized sophisticated computer graphics to create the digital representation of this fun-loving robot. This digital data was well suited for rapid and precise fabrication of all of the external covers that comprise WALL-E. Although the covers could have been created in a number of ways, the Disney Imagineering team chose to have the parts created on an SLA rapid prototyping system (see sidebar on the SLA process). The form, fit, and overall appearance of the prototype SLA covers were validated with a working robot.

    The final covers for the traveling robot needed to be significantly tougher than the initial SLA covers, so advanced cast urethane covers were reproduced using a silicone tool created from the SLA masters. The cast urethane process has been accomplished a number of ways by a variety of companies. Although we don’t have information on which method was used for this job, here is what the process might look like, based on general practices for secondary processing of the master.

    First, a master pattern – created using SLA, CNC, or PolyJet technology – is worked to the desired surface finish. Tape is then carefully positioned in specific areas to create a joint or parting line that assists with cutting the pattern out of the mold. Next, a mold box is built to enclose the master pattern; the box size is minimized so that the poured platinum-based silicone is not wasted (a rough material estimate is sketched below). The master is elevated off the floor of the box so the silicone can surround it. The silicone is given enough time to cure, after which the mold is cut into two halves and the master pattern is removed. A two-part polyurethane liquid is mixed and then poured (with a proprietary pressure differential) into the mold. The filled mold is placed into a proprietary pressure oven, where the cast polyurethane part is allowed to fully cure, achieving maximum mechanical properties. Finally, the top half of the mold is removed and the finished cast part is taken out.

    The distressing and rust texture for WALL-E was reproduced from the animated production using paint. “Exterior components – including the treads and details on the inside of the WALL-E camera eyes – were based on the movie data and placed ‘on-model’ to look as authentic as possible,” according to Akhil Madhani, Principal Technical Staff Director for Walt Disney Imagineering Research & Development.

    “For motion, the tracks are driven using custom designed brushless DC servomotors, which operate through planetary gearheads,” Akhil said. The remaining motors are standard brushed motors using a variety of reduction mechanisms (a back-of-the-envelope look at what a planetary gearhead buys is sketched below). All of the mechanisms were custom designed, including the tracks and treads. As with WALL-E’s panels, the tread texture was copied from the movie models.

    Control software, as well as all the animation software, was written in-house at Disney and Pixar. This includes the system that allows the company to play Pixar-created animation on the physical robot in order to maintain its character.

    Designing the ‘real’ WALL-E was, as with many Disney projects, highly proprietary, allowing only general information to be discussed. Akhil did say that “every part of the system, including electronics, was included in the CAD model.” His team used Pro/ENGINEER CAD software for design.
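
    The process description above notes that the mold box is kept as small as possible so silicone is not wasted. As a rough illustration of why, the sketch below computes silicone consumption as box volume minus master volume for a hypothetical cover panel; all of the dimensions are invented, since the article gives no part sizes.

    ```python
    # Illustrative only: silicone needed = mold box volume - master pattern volume.
    # Dimensions are hypothetical; the article does not give part sizes.

    def silicone_needed_cm3(box_dims_cm, master_volume_cm3):
        """Volume of silicone poured around the master inside the mold box."""
        w, d, h = box_dims_cm
        return w * d * h - master_volume_cm3

    if __name__ == "__main__":
        # Hypothetical cover panel: roughly 30 x 20 x 8 cm, displacing ~1200 cm^3.
        for wall in (2.0, 4.0, 6.0):  # clearance around the master, in cm
            box = (30 + 2 * wall, 20 + 2 * wall, 8 + 2 * wall)
            vol = silicone_needed_cm3(box, 1200.0)
            print(f"{wall:.0f} cm clearance -> {vol / 1000:.1f} liters of silicone")
    ```

    Even a couple of extra centimeters of clearance on every side adds liters of silicone per tool, which is why minimizing the box matters.

    To illustrate what a planetary gearhead contributes to the track drive, the second sketch trades motor speed for output torque through a reduction ratio. The motor figures, ratios, and efficiency are assumptions; Disney has not published the actual drivetrain specifications.

    ```python
    # Back-of-the-envelope gearhead math: output torque scales up by the ratio
    # (times efficiency) while output speed scales down by it. Numbers are
    # hypothetical, not the actual WALL-E drivetrain.

    def gearhead_output(motor_torque_nm, motor_speed_rpm, ratio, efficiency=0.9):
        """Return (output torque in N*m, output speed in rpm) after reduction."""
        return motor_torque_nm * ratio * efficiency, motor_speed_rpm / ratio

    if __name__ == "__main__":
        # Hypothetical small brushless servo: 0.2 N*m continuous at 6000 rpm.
        for ratio in (10, 30, 100):
            torque, speed = gearhead_output(0.2, 6000, ratio)
            print(f"{ratio:>3}:1 gearhead -> {torque:5.1f} N*m at {speed:6.0f} rpm")
    ```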

  • The Groundbreaking Technology Behind Disney's New Robotic Olaf

    How Disney Imagineering Research & Development brought a robot into the real world that walks and talks just like the beloved animated character from Frozen.

    Terry Persun | Theme Parks | Jan 13, 2026

    Per a recent press release, Disney Imagineering Research & Development has brought Olaf into the physical world as a fully free-walking robotic character. Doing so meant solving several real-world challenges, chief among them translating a stylized, animated character with non-physical movement into a believable real-world figure. At the outset, it was obvious that traditional robotics approaches were not going to work, simply because Olaf’s proportions, motion style, and expressive requirements differ significantly from those of typical walking robots.

    Challenges included supporting Olaf’s large, heavy head on a very slim neck. Then there were the small snowball feet with no visible legs, paired with an animated walk cycle that doesn’t follow realistic physics. Finally, the character is highly sensitive to noise, jitter, or awkward impacts, any of which could easily break the illusion of life. Even small issues like loud footsteps or stiff motion were found to immediately reduce believability, making this one of the most demanding character robotics projects Disney has attempted.

    To preserve Olaf’s on-screen appearance, the team designed a compact robotic structure that is completely hidden beneath the costume. They used a novel asymmetric six-degree-of-freedom leg system with one leg inverted relative to the other. The legs are totally concealed under the soft polyurethane foam skirt to create the illusion that Olaf’s feet move freely beneath his body. The flexible foam snowballs absorb impacts and allow recovery steps. The design lets Olaf walk naturally while keeping all of the mechanical elements out of view.

    Reinforcement Learning

    Rather than programming Olaf’s movements by hand, the Imagineering team relied on reinforcement learning guided by animation references. Artists first created stylized walking and standing animations and then used them to train AI policies in simulation. Separate standing and walking policies were used, along with a reward system focused on matching the animation, maintaining balance, and staying within the physical limits of the robot (a simplified reward sketch appears below). Training also included real-time puppeteering through an animation engine that blends idle motion, triggered gestures, and joystick control. This approach allows Olaf to move in a way that closely matches his animated personality, rather than simply walking like a typical robot.

    Solving Noise and Overheating Problems

    Two practical issues proved especially challenging: footstep noise and overheating. To address sound, researchers introduced a special impact-reduction reward that smooths foot motion during contact with the ground. Testing showed this reduced average stepping noise by 13.5 decibels without significantly changing Olaf’s gait. To address overheating, especially in the neck, where small actuators support Olaf’s heavy head, the team developed a thermal-aware control policy that feeds real-time actuator temperature into the AI system. This allows the system to adjust motion and reduce torque before temperatures reach unsafe levels (see the derating sketch below). The approach slightly relaxes animation accuracy when needed to protect the hardware, enabling Olaf to perform extended movement without damaging internal components.

    Expressive Face, Mouth, and Arms

    Beyond walking, Olaf’s expressiveness comes from a separate set of “show functions” that control fully articulated eyes and eyelids, move his mouth so he appears to talk, and drive his arms through hidden spherical linkages. All of these elements are controlled using classical methods rather than reinforcement learning, allowing precise facial and gesture animation to be layered on top of the walking system. Many costume elements, including the carrot nose and arms, are magnetically attached so they can safely detach during a fall.

    Olaf represents a new benchmark for believability in robotic characters. While the system was built specifically for Olaf, the techniques developed – including the asymmetric leg design, thermal-aware AI policies, and sound-reducing motion control – can be applied to future Disney characters. As Disney continues to preview Olaf’s upcoming debut in parks overseas, the research makes it clear that this is only an early step toward a broader lineup of expressive, autonomous characters. The self-walking Olaf will debut at World of Frozen in Hong Kong Disneyland and at Walt Disney Studios Park in Paris in 2026.

    Image courtesy of Disney.

    For more information: Disney | Disney Hong Kong-Frozen | Disney Paris-Frozen
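
    The press release describes the reward system only in broad strokes: match the artists’ animation, stay balanced, respect the robot’s physical limits, and, for the noise fix, avoid hard foot impacts. The sketch below is a simplified, hypothetical composition of such terms; the signals, weights, and exponential shaping are assumptions, not Disney’s actual reward function. For reference on the quoted result, a 13.5 dB drop in stepping noise corresponds to roughly a 4.7x reduction in sound pressure (10^(13.5/20) ≈ 4.7).

    ```python
    import numpy as np

    # Hypothetical per-step reward combining the kinds of terms described above:
    # reward animation tracking and balance, penalize exceeding actuator limits,
    # and penalize abrupt foot velocity at ground contact (the noise fix).
    # Weights and shaping are invented for illustration.

    def step_reward(joint_pos, ref_pos, torso_tilt_rad, torques, torque_limit,
                    foot_speed, foot_in_contact,
                    w_track=1.0, w_balance=0.5, w_torque=0.2, w_impact=0.3):
        track = np.exp(-np.sum((joint_pos - ref_pos) ** 2))      # match the animation
        balance = np.exp(-torso_tilt_rad ** 2)                    # stay upright
        torque_pen = np.sum(np.clip(np.abs(torques) - torque_limit, 0.0, None))
        impact_pen = foot_speed ** 2 if foot_in_contact else 0.0  # soften footfalls
        return (w_track * track + w_balance * balance
                - w_torque * torque_pen - w_impact * impact_pen)

    if __name__ == "__main__":
        r = step_reward(joint_pos=np.zeros(12), ref_pos=np.full(12, 0.05),
                        torso_tilt_rad=0.02, torques=np.full(12, 1.5),
                        torque_limit=2.0, foot_speed=0.4, foot_in_contact=True)
        print(f"example per-step reward: {r:.3f}")
    ```

    Disney describes the overheating fix as a learned policy that sees actuator temperatures and backs off torque before limits are reached. The derating sketch below replaces that learned behavior with a much simpler classical stand-in, a linear torque scale-down between a soft and a hard temperature limit, purely to illustrate the idea; the thresholds are invented.

    ```python
    # Illustrative thermal derating: commanded torque is scaled from 100% at a
    # soft temperature limit down to 0% at a hard limit. A simplified classical
    # stand-in for the learned thermal-aware policy; limits are hypothetical.

    def derate_torque(commanded_nm: float, temp_c: float,
                      soft_limit_c: float = 70.0, hard_limit_c: float = 90.0) -> float:
        """Scale torque down linearly as temperature approaches the hard limit."""
        if temp_c <= soft_limit_c:
            return commanded_nm
        if temp_c >= hard_limit_c:
            return 0.0
        scale = (hard_limit_c - temp_c) / (hard_limit_c - soft_limit_c)
        return commanded_nm * scale

    if __name__ == "__main__":
        for temp in (55, 72, 80, 88, 92):
            print(f"{temp} C -> {derate_torque(2.0, temp):.2f} N*m of a 2.0 N*m command")
    ```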
