- Electric VTOL Aircraft
Unique cooling solution is optimized to provide cooling for an electric Vertical Take Off and Landing aircraft capable of traveling flight distances of 1,000 km. Terry Persun Cool Stuff Jun 10, 2025

Conflux Technology designs, engineers, and produces additively manufactured heat exchangers for a variety of thermal challenges across multiple industries. Recently, the company unveiled a collaboration with AMSL Aero, an Australian aircraft manufacturer building the world's most efficient long-range zero-emissions electric VTOL aircraft.

Also from EE: Electric Race Car Uses 3D-Printed Components

Under the first phase of the project to develop hydrogen fuel cell cooling for AMSL Aero's Vertiia VTOL aircraft, Conflux developed three heat exchanger concepts, each focused on minimizing weight and volume while managing continuous heat loads and reducing drag. The ultimate goal is to enable flight distances of up to 1,000 km. Now in its second phase, the company will optimize the design and manufacture a full proof-of-concept assembly to evaluate its performance within Vertiia's hydrogen fuel cell powertrain.

According to Michael Fuller, CEO & Founder of Conflux Technology, "Hydrogen fuel cells represent a transformative technology in Australia's pursuit of sustainable energy solutions. We are proud to incorporate our heat exchange technology to enhance the efficiency and performance of Vertiia's hydrogen fuel cells. Together, we're advancing innovation in creating world-leading sustainable air transport."

The Conflux cooling solution will be optimized to handle the high transient heat loads experienced during vertical take-off, landing, and hover operations. Weight, performance, and packaging size are key constraints for aeronautical hydrogen powertrains.
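As a rough illustration of the sizing problem, the coolant flow a heat exchanger must support scales directly with the heat load it carries. The sketch below applies the basic relation Q = ṁ·cp·ΔT; every figure in it is an illustrative assumption, not Conflux or AMSL Aero data.

```python
# First-order coolant-flow sizing for a fuel-cell heat exchanger.
# All numbers are illustrative assumptions, not Conflux/AMSL data.

def coolant_mass_flow(heat_load_w: float, cp_j_per_kg_k: float, delta_t_k: float) -> float:
    """Mass flow (kg/s) needed to carry heat_load_w with a coolant
    temperature rise of delta_t_k: m_dot = Q / (cp * dT)."""
    return heat_load_w / (cp_j_per_kg_k * delta_t_k)

# Example: a 120 kW continuous stack heat load, water-glycol coolant
# (cp ~ 3,600 J/kg.K), and a 15 K allowable coolant temperature rise.
m_dot = coolant_mass_flow(120_000, 3_600, 15.0)
print(f"required coolant flow: {m_dot:.2f} kg/s")  # ~2.22 kg/s
```

Halving the allowable temperature rise doubles the required flow, which is why minimizing weight and volume while still moving enough coolant is the central trade-off the article describes.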
The geometric freedom granted by additive manufacturing means heat exchangers for these systems can be lightweight and conform to the space available. Conflux's unique thin-walled, patented designs deliver high thermal performance and low drag.

Photo: A closeup of Conflux fins.

AMSL Aero chairman Chris Smallhorn said: "In Vertiia we are building a hydrogen-electric aircraft that flies record-breaking distances at Formula 1 speeds, making Conflux Technology, with its storied history of innovation in motorsport and aviation, the perfect partner for us. Conflux's AS9100D manufacturing and quality certification is critical in enabling Vertiia to become the world's first long-range passenger-capable hydrogen VTOL."

Working together, the companies are pioneering a future where clean energy and cutting-edge engineering drive the aviation industry toward greener, more sustainable flight. This engagement further expands Conflux's presence in aerospace, leveraging additive manufacturing to develop high-performance heat exchangers for next-generation aviation. Applications now extend across propulsion system cooling, transmission and gearbox cooling, environmental and avionics cooling, and power electronics cooling. As demand for sustainable solutions accelerates, additive manufacturing is setting new performance benchmarks, delivering advanced thermal management solutions that conventional methods cannot achieve.

Watch: News video of the Vertiia

For more information:
Conflux Technology
AMSL Aero
- Explore the International Space Station with Immersive VR Experience
This VR experience features realistic and intuitive navigation, helpful advice from Mission Control, and compelling and visceral science-related content. Edited by EE Staff Cool Stuff Oct 28, 2025

Every person on the planet, young and old alike, has wondered about our place in the universe. The idea of space creates a deep yearning for answers and experiences in all of us, and it serves as a common, unifying thread for humanity. NASA wanted to bring the magic of space travel to everyone, so it teamed up with Magnopus to bring Mission: ISS into reality and promote these shared feelings through thrilling immersive technology.

Mission: ISS lets users explore the International Space Station in detail, to understand what it's like to be an astronaut in a way that has never before been possible. Based on NASA models and honed with input from astronauts who have lived in space, Mission: ISS recreates the International Space Station in painstaking detail. Users can experience how to move and work in zero gravity, use space tools, dock a space capsule, and take a spacewalk.

Images courtesy of Magnopus, NASA, and Meta.

Real astronauts provide a sense of presence through instructional video clips. With strong STEM-related themes, anyone can take part in experiments and actual missions on the station. Since the experience is fully immersive, users get a sense of weightlessness that only VR can convey, similar to what astronauts feel, a fact that Mission: ISS's astronaut advisors have confirmed. In fact, more than one astronaut said it was like making a return trip! VR experiences are ideal for taking users to places that are too dangerous or too expensive to visit.
With that in mind, Mission: ISS was designed to remove those barriers: literally anyone with a VR headset can get a taste of space exploration.

Images courtesy of Magnopus, NASA, and Meta.

Mission: ISS is a non-profit initiative. It is freely available on Meta's Quest store and has been demonstrated to wide acclaim at science exhibits, international conferences, fairs, and exhibitions across North America and Europe. The experience has won multiple awards, including honors from the Television Academy, the VR Awards, and the XR Awards, and was even nominated for an Emmy.

For more information:
Magnopus
NASA
Meta

Read more about applications in space >>>
- Custom Software and Touch Screen Technology Used for Wine Tasting Experience
A first-of-its-kind interactive digital touch table and software package is custom built for wine tasting. Jim Spadaccini, Founder & Creative Director of Ideum Cool Stuff Jun 16, 2025

I am thrilled to be the first to announce that Ideum has released a new touch table called the Tasting Table. It comes bundled with software that facilitates wine tasting. The Wine Experience allows guests to learn more about the tasting process and share what they experience. The software provides a new way to experience wine, demystifying and deepening the tasting experience as part of the company's Sensory Dining experience.

Photo: Nebbiolo ripens on the vine in the author's backyard vineyard.

It is not hyperbole to say that we've been working on this new interactive experience for decades. The last project I managed at the Exploratorium in San Francisco was the Science of Wine project, during which I was introduced to Ann Noble, the U.C. Davis professor who developed the Wine Aroma Wheel. A custom, highly modified version of that open-source wheel is central to The Wine Experience application. We've made it interactive, so guests can select and share the flavors they are tasting. Analytics are built into the software, so a winery that uses the application can see what its guests are experiencing, effectively crowdsourcing its tasting notes.

How it Works

The Wine Experience software is paired with our new Tasting Table, developed explicitly for tasting rooms, pop-up events, and other spaces where wine tastings happen. The Tasting Table has a unique design: it is bar-height and has an optically bonded 55-inch touch display. An onboard computer makes the system plug-and-play. The system is lockable, and the software only requires an internet connection to load new content.
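The crowdsourced analytics described above amount to tallying which aroma-wheel descriptors guests select most often. A minimal sketch, assuming a hypothetical data model (not Ideum's actual schema):

```python
# Sketch of crowdsourced tasting-note analytics: tally how often guests
# selected each aroma-wheel descriptor. Data model and flavor names are
# hypothetical illustrations, not Ideum's schema.
from collections import Counter

def aggregate_tasting_notes(sessions: list[list[str]]) -> list[tuple[str, int]]:
    """Return (descriptor, count) pairs, most frequently selected first."""
    counts = Counter(flavor for session in sessions for flavor in session)
    return counts.most_common()

guest_sessions = [
    ["cherry", "oak", "vanilla"],
    ["cherry", "leather"],
    ["cherry", "oak"],
]
print(aggregate_tasting_notes(guest_sessions))
# [('cherry', 3), ('oak', 2), ('vanilla', 1), ('leather', 1)]
```

A winery could run a tally like this per wine to see, at a glance, which notes its guests actually report.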
In addition, the experience uses our object-recognition system, Tangible Engine. This software allows the interactive coasters that are part of the Tasting Table to recognize up to eight different wines. Tangible Engine was the first object-recognition software package for projected-capacitive touch tables; many design firms, including ours, use it.

Over the years, we developed experimental applications that involved tasting. The Interactive Coffee Experience was designed with Starbucks and appeared at several pop-up events. We also developed an interactive wine-tasting experience, the JCB Tasting Salon, with JCB Wines. More recently, we worked with MSC Cruises to create the Interactive Wine Bar for the Euribia cruise ship. These experiences, and the great and knowledgeable collaborators we worked with, have contributed to our thinking as we developed this exciting new wine-tasting experience. In addition, we worked closely with VARA Winery & Distillery, who provided excellent feedback during our extensive testing and tasting sessions!

Photo: Ambassador and former NM Senator Tom Udall & entrepreneur Steve Case visit Ideum and try an interactive coffee tasting developed with Starbucks.

Our mission was to create a guest-centric experience, focusing on the individuals tasting the wine. We want to enhance the tasting experience, making the interactive piece less of a brochure for the winery and more about the social experience surrounding tasting and the joy of tasting fine wines. I've been interested in wine since working in a wine shop and a restaurant in my early twenties. I also have a small vineyard in Corrales, New Mexico, with 150 vines I've been growing for over a decade. This project blends my personal and professional interests like no other endeavor has. We've had our first pop-up event with VARA, with more planned, and will have our first permanent installation at the New Mexico Wine Association's new tasting room in Old Town later this year.
For more information:
See the video
Ideum
Tangible Engine
Sensory Dining Website
Wine Aroma Wheel
VARA Winery & Distillery
New Mexico Wine Association
- The Groundbreaking Technology Behind Disney's New Robotic Olaf
How Disney Imagineering Research & Development brought a robot into the real world that walks and talks just like the beloved animated character from Frozen. Terry Persun Theme Parks Jan 13, 2026

Per a recent press release, Disney Imagineering Research & Development has brought Olaf into the physical world as a fully free-walking robotic character. Doing so posed several real-world challenges, chief among them translating a stylized, animated character with non-physical movement into a believable real-world figure. At the outset, it was obvious that traditional robotics approaches were not going to work, simply because Olaf's proportions, motion style, and expressive requirements differ significantly from those of typical walking robots.

The challenges included creating Olaf's large, heavy head while supporting it with a very slim neck. Then there were the small snowball feet with no visible legs, paired with an animated walk cycle that doesn't follow realistic physics. Finally, there was the character's high sensitivity to noise, jitter, or awkward impacts, any of which could easily break the illusion of life. Even small issues like loud footsteps or stiff motion were found to immediately reduce believability, making this one of the most demanding character robotics projects Disney has attempted.

To preserve Olaf's on-screen appearance, the team designed a compact robotic structure that is completely hidden beneath the costume. They used a novel asymmetric six-degree-of-freedom leg system with one leg inverted relative to the other. The legs are totally concealed under the soft polyurethane foam skirt to create the illusion that Olaf's feet move freely along his body.
The flexible foam snowballs absorb impacts and allow recovery steps. The design lets Olaf walk naturally while keeping all of the mechanical elements out of view.

Reinforcement Learning

Rather than programming Olaf's movements by hand, the Imagineering team relied on reinforcement learning guided by animation references. Artists first created stylized walking and standing animations, which were then used to train AI policies in simulation. Separate standing and walking policies were used, along with a reward system focused on matching the animation, maintaining balance, and staying within the physical limits of the robot. Training also included real-time puppeteering through an animation engine that blends idle motion, triggered gestures, and joystick control. This approach allowed Olaf to move in a way that closely matches his animated personality, rather than simply walking like a typical robot.

Solving Noise and Overheating Problems

Two practical issues proved especially challenging: footstep noise and overheating. To address sound, researchers introduced a special impact-reduction reward that smooths foot motion during contact with the ground. Testing showed this reduced average stepping noise by 13.5 decibels without significantly changing Olaf's gait. To address overheating, especially in the neck, where small actuators support Olaf's heavy head, the team developed a thermal-aware control policy that feeds real-time actuator temperatures into the AI system. This allows the system to adjust motion and reduce torque before temperatures reach unsafe levels. The approach slightly relaxes animation accuracy when needed to protect the hardware, enabling Olaf to perform extended movement without damaging internal components.
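The reward structure described above, matching the animation, keeping balance, softening footfalls, and backing off as actuators heat up, can be pictured as a weighted sum of terms. This is a minimal sketch with made-up weights and terms, not Disney's actual policy:

```python
# Illustrative RL step reward combining the four pressures described in
# the article. All weights and functional forms are made up for the sketch.
import math

def step_reward(pose_error, base_tilt, foot_impact_vel, actuator_temp_c,
                temp_limit_c=70.0):
    """Higher is better: track the artist's animation, stay upright,
    land feet softly (quieter steps), and penalize torque once actuators
    run hot (the thermal-aware term)."""
    r_anim = math.exp(-2.0 * pose_error)           # match animation reference
    r_balance = math.exp(-5.0 * abs(base_tilt))    # stay balanced
    r_quiet = -0.5 * foot_impact_vel ** 2          # impact-reduction penalty
    r_thermal = -0.1 * max(0.0, actuator_temp_c - temp_limit_c)  # thermal-aware term
    return r_anim + r_balance + r_quiet + r_thermal

# A well-tracked, soft, cool step scores high; a hot, stomping step scores low.
print(step_reward(0.05, 0.02, 0.1, 45.0))
print(step_reward(0.5, 0.2, 2.0, 85.0))
```

The thermal term shows why accuracy is "slightly relaxed" under heat: once the temperature crosses the limit, the policy can earn more reward by deviating from the animation than by tracking it exactly.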
Expressive Face, Mouth, and Arms

Beyond walking, Olaf's expressiveness comes from a separate set of "show functions" that control fully articulated eyes and eyelids, move his mouth so he appears to talk, and move his arms through hidden spherical linkages. All of these elements are controlled using classical methods rather than reinforcement learning, allowing precise facial and gesture animation to be layered on top of the walking system. Many costume elements, including the carrot nose and arms, are magnetically attached so they can safely detach during a fall.

Olaf represents a new benchmark for robotic character believability. While the system was built specifically for Olaf, the techniques developed, including asymmetric leg design, thermal-aware AI policies, and sound-reducing motion control, can be applied to future Disney characters. As Disney continues to preview Olaf's upcoming debut in parks overseas, the research makes it clear that this is only an early step toward a broader lineup of expressive, autonomous characters. The self-walking Olaf will debut at World of Frozen in Hong Kong Disneyland and at Walt Disney Studios Park in Paris in 2026.

Image courtesy of Disney.

For more information:
Disney
Disney Hong Kong-Frozen
Disney Paris-Frozen

Read more about theme parks >>>
- How a Deep Sea Camera Servo Drive Endures Extreme Pressures
Designing the right servo drive for camera motion in deep-sea exploration meant the drive had to withstand extreme pressures of up to 8,800 psi. Edited by EE Staff Cool Stuff Dec 2, 2025

At 6,000 meters below the surface of the sea, pressures can reach 8,800 psi. (For comparison, the Titanic rests at only 3,800 meters.) Equipment used at that depth, such as a camera motion device, often has to be encased inside a thick, sealed container. When the customer approached ADVANCED Motion Controls (AMC) about using its servo drives without a sealed container, they proposed an alternative approach: submerging the electronics in a non-conducting oil bath. After speaking with AMC's applications engineers, the customer was encouraged to "give it a try," even though AMC had never tested its devices at the required depths.

The customer purchased a standard DigiFlex servo drive and performed high-pressure testing of the device while it was submerged in the oil bath. The drive held up remarkably well, except that the standard electrolytic capacitors (shaped like small cans) were crushed by the pressure. The customer's solution during the testing process was to drill a small hole in each capacitor housing and allow the oil to equalize the pressure. This worked perfectly for the prototype, but as a manual modification it would be costly and inefficient in a production setting.

Images courtesy of ADVANCED Motion Controls.

After some design consideration, the AMC applications engineering team provided a custom solution using high-pressure-tolerant solid-state capacitors, which handle the pressure without any additional modification.
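The quoted figures are easy to sanity-check with the hydrostatic pressure relation P = ρgh:

```python
# Sanity-check the depth/pressure figures quoted above using P = rho * g * h.
RHO_SEAWATER = 1025.0   # kg/m^3, typical seawater density
G = 9.81                # m/s^2
PA_PER_PSI = 6894.757   # pascals per psi

def pressure_psi(depth_m: float) -> float:
    """Hydrostatic gauge pressure at a given depth, in psi."""
    return RHO_SEAWATER * G * depth_m / PA_PER_PSI

print(f"{pressure_psi(6000):.0f} psi at 6,000 m")        # ~8,750 psi
print(f"{pressure_psi(3800):.0f} psi at Titanic depth")  # ~5,542 psi
```

The result at 6,000 m lands within about half a percent of the article's 8,800 psi figure (the small gap comes from density varying with depth and from the atmospheric offset).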
For feedback, the customer proposed incorporating a magnetic encoder chip, which required a magnet on the rotating shaft and a sensor chip positioned precisely above it. The AMC team chose to design a custom daughterboard that the drive could plug into. The daughterboard holds the sensor chip in perfect alignment with the motor shaft magnet, solving the mechanical difficulty of alignment, saving the customer from having to build custom brackets, and reducing the mounting footprint. The customer also required that the main communications between the drive and the host be handled over standard RS-232.

Images courtesy of ADVANCED Motion Controls.

By overcoming multiple challenges, AMC worked with its customer to produce a fitting solution to a difficult project. The underwater camera became a successful and integral product for undersea exploration. Cross-industry applications for AMC technologies are explained in this video:

For more information:
ADVANCED Motion Controls
DigiFlex Drive
Servo Drive Selector

Other underwater applications >>>
- WATCH: One-on-One with Joe Rando of Rando Productions
A conversation with Joe Rando on how teams of creative engineers bring entertainment to life in stage, film, theme parks, and more. EE Staff Stage Events Feb 16, 2026

Terry Persun sat down with Joe Rando, whose company, Rando Productions, provides custom technology for major entertainment productions, including theme parks, television, and much more. Rando explains how, in an AI age, it remains essential to employ teams of creative engineers to create entertainment experiences that tell stories and connect with audiences on an emotional level. Persun and Rando dive deep into the nitty-gritty of how Rando Productions works year-round on custom technology to consistently deliver specially designed machines and systems within specific timeframes.

Watch the full interview here:

For more information:
Rando Productions
- iPhone Lenses Used in Film Shoot
Van Nuys, California, June 25, 2014—When Sebastian Lindstrom, co-founder of What Took You So Long, and his team journeyed to Liberia for their latest documentary, they employed an iPhone 5s equipped with the iPro Lens System® by Schneider Optics. Terry Persun Film and TV Jun 3, 2025

"We are a small documentary production company that specializes in supporting non-profits and development entities around the world," explains Lindstrom. "Our method of filmmaking depends on high-quality, lightweight equipment." For the documentary about women with obstetric fistula, which was co-funded by the United Nations Population Fund, the team took two Canon 5Ds to the West African country. For the most audience-engaging results, they also took a 7D and an iPhone to capture slow-motion and extreme close-up footage.

"We shot with the iPhone 5s, mostly in 720p at 120 fps, with iPro Super Wide and Macro lenses," he says. "The Macro captures the same details as a $1,000 lens would. It's amazing how close you can get with it; we were able to position it just inches from a person's iris," which is shown in the opening sequence of the video. On the opposite end of the spectrum, "The iPro Super Wide frames a much bigger picture of the world than the native camera inside the iPhone," Lindstrom adds. "While at a stadium in Monrovia for a big soccer game between Liberian and Ghanaian teams, the president of Liberia came out on the field to wish everybody 'good luck.' The iPro Super Wide lens enabled us to spontaneously capture innovative wide-angle shots for slow-motion content that will take audiences' imaginations to the next level when watching our videos.
In retrospect, we should have left the 7D at home, because the daylight footage we caught with the iPhone and iPro lenses was superior, and it was much easier to pack."

While the DSLR revolution gave the world access to smaller, less expensive cameras, Lindstrom notes that "Filming with a phone takes it to another level, as you can quickly position angles that your DSLR would require a jib for, and it's something most people travel with anyway. And you could never get smooth traveling shots by holding your DSLR outside a car window with your hand like we did with the iPhone. We believe that the iPro Lens System used in conjunction with the iPhone 5s for slow-motion filming has the potential to become an important, value-added component for any type of documentary work. We plan to use them in all our upcoming shoots around the world as an integral part of our DSLR filming."

For the past five years, What Took You So Long has worked in more than 70 developing countries. "Some of the areas we travel to may not take kindly to visiting filmmakers," he says. "So it's very beneficial to be able to shoot with the iPhone, now with professional-grade lenses that are easy to conceal."

For more information:
What Took You So Long Home
Schneider Optics Lenses
- 60 Stage Configurations Supported by Flexible Automation
Beckhoff provides the automation flexibility and reliability needed to convert walls, floors, and backdrops according to performance schedules. Edited by Terry Persun Stage Events Sep 12, 2025

The Perelman Performing Arts Center in New York (PAC NYC) offers visitors a truly unique theater experience: advanced stage technology makes its three performance spaces extremely versatile. PAC NYC stands at the foot of Manhattan's One World Trade Center building, across from the 9/11 Memorial & Museum. While it offers programming similar to other major New York City theaters, the mission of this gathering space is distinctly communal. "PAC NYC is a place of civic healing," says Miranda Palumbo, Director of Digital Content at PAC NYC. "Because we are on the World Trade Center campus, it's our responsibility to help everyone celebrate life."

The venue features three performance spaces that can flexibly combine or divide into over 62 configurations. The backstage technology also supports dynamic set changes and flying performers through the air. To deliver the engineering behind the performance art, PAC NYC directed The Chicago Flyhouse, Inc. and its programming partner, ELPLANT, to implement a safe, reliable, and flexible stage automation system. Flyhouse provides rigging, hoisting, and performer-flying equipment for venues around the world, ranging from hospitals and high schools to theaters and arenas.

Distributed Control

Flyhouse incorporated its distributed "MoM-and-Kid" control concept, in which a central server, the Master of Machines (MoM), communicates with distributed modules (the Kids). The more than 30 Kid modules at PAC NYC each have their own Beckhoff CX9020 Embedded PC and EtherCAT I/O wired to control Flyhouse's ZipLift hoists and other equipment.
The modules can be easily moved, connected to other hoists, or swapped out for maintenance.

Image courtesy of Beckhoff.

The large number of Kid modules and their associated motion axes throughout the theater level raised the bar on the facility's networking capabilities. The Flyhouse technologies also needed to interface with other vendors' solutions, such as the systems that raise and lower the massive walls or change the floor configuration from flat to stair-stepped. This meant that safety zones had to adjust dynamically as the spaces changed, to ensure human and equipment safety. "Even though the duty cycles are relatively short in theaters, we needed the reliability that comes with industrial automation."

Beckhoff supplied an ideal solution. Its EtherCAT and PC-based control technology provides a foundation for seamless operation and high adaptability. Flyhouse collaborated with Beckhoff USA and ELPLANT to design next-generation control modules. ELPLANT, an ISO 9001-certified systems integrator based in Serbia, brought expertise in industrial automation and entertainment applications. ELPLANT CEO Aleksandar Arsić explained, "Beckhoff was undoubtedly the logical choice, as few systems could provide such a modular and configurable architecture."

The system incorporates TwinCAT PLC, NC PTP motion control, TwinSAFE safety systems, extensive EtherCAT communication, TwinCAT PLC visualizations, TwinCAT HMI, database communication, and ADS links to third-party applications, such as C# WPF (Windows Presentation Foundation) operator consoles. Real-time communication allowed the team to configure the topology so that each embedded PC or other EtherCAT device operates as an independent sync unit. Much of the equipment also features EtherCAT P, which combines data and power on one cable. This configuration allows technicians to remove or add Kid modules without taking all the others offline.
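The dynamic safety-zone behavior, where combining two theater spaces means an E-stop must halt every axis in the merged space, can be sketched with a simple grouping model. Class and method names here are hypothetical illustrations, not the Flyhouse or TwinSAFE API:

```python
# Toy model of dynamic safety zones: when theater spaces combine,
# an E-stop in either space must halt all axes in the merged zone.
# Names are hypothetical, not the actual TwinSAFE/Flyhouse API.

class SafetyZones:
    def __init__(self, axes_by_space: dict[str, set[str]]):
        self.axes_by_space = axes_by_space
        self.groups = [{name} for name in axes_by_space]  # each space starts alone

    def combine(self, a: str, b: str) -> None:
        """Merge the zones containing spaces a and b (walls raised)."""
        ga = next(g for g in self.groups if a in g)
        gb = next(g for g in self.groups if b in g)
        if ga is not gb:
            self.groups.remove(gb)
            ga |= gb

    def estop(self, space: str) -> set[str]:
        """Every axis that must halt when this space's E-stop fires."""
        group = next(g for g in self.groups if space in g)
        return set().union(*(self.axes_by_space[s] for s in group))

zones = SafetyZones({"theater_a": {"hoist1", "hoist2"},
                     "theater_b": {"hoist3"},
                     "theater_c": {"hoist4"}})
zones.combine("theater_a", "theater_b")
print(sorted(zones.estop("theater_b")))  # ['hoist1', 'hoist2', 'hoist3']
print(sorted(zones.estop("theater_c")))  # ['hoist4']
```

In the real system this regrouping happens over FSoE on the live EtherCAT network rather than in application code, which is what makes the on-the-fly reconfiguration practical.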
Beyond sheer speed and robust diagnostics, EtherCAT supports free selection of topology. It also offers hot-connect functionality and automatic addressing of devices, simplifying component exchange and plug-and-play installation. Flyhouse also harnessed integrated functional safety with TwinSAFE terminals: safety information is transmitted via Safety over EtherCAT (FSoE) on the standard EtherCAT network rather than over a separate, hardwired system. Beyond reducing wiring effort and cost, TwinSAFE simplified implementation of the configurable theater concept.

Flyhouse deployed its Ease® Control Console in each theater space, simplifying axis operation with joysticks and a multi-touch screen whose visualization was built with TwinCAT Human-Machine Interface (HMI). For safety reasons, the consoles can't access axes outside the operator's line of sight, so when walls are raised to combine spaces, the consoles need to control all the axes in the larger room. Likewise, E-stop buttons need to halt all motion in combined spaces if required, meaning that the MoM-and-Kid architecture must change on the fly. This could have been incredibly complex to implement, but with the flexibility of EtherCAT and the software capabilities of TwinCAT, it was implemented seamlessly.

The modular system will continue to support upgrades, and with a scalable, future-proof automation platform, this process won't require a rip-and-replace of infrastructure. Instead, technicians can simply make changes in software or replace a device with a newer version. Beyond reducing costs, this approach avoids unwanted intermissions spent hunting for obsolete components. As Mark Witteveen puts it, when the lights dim and the stage comes to life, the audience isn't thinking about automation. "They're simply immersed in the magic. And that experience makes all the effort worthwhile."

For more information:
Beckhoff
PAC NYC
Flyhouse
Elplant
- Mushrooms Playing Music? How an Engineer and a Musician Turn Bioelectric Signals into Art
Bionic and the Wires uses bionic arms, bio-sensors, and electronic instruments to create music from plants and fungi. Nicole Persun Music, Cool Stuff Nov 11, 2025

Somewhere in the woods outside Manchester, a mushroom is hooked up to electrodes and bionic arms and given a synthesizer. In response to the fungus's bioelectric signals, the mallet-like arms knock on the synthesizer and create music.

Bionic and the Wires is an artistic project that blends technology and nature. It was created by Jon Ross, a multi-disciplinary eco-artist, technologist, and environmental thinker, and Andy Kidd, a musician with a background in electronic music. Jon brings the "how" with the technology, and Andy brings the "what" with the sound design. The result is strange, otherworldly music intended to make the viewer think differently about the natural world.

The inspiration for the project came, Jon says, "from two key areas: the emerging scientific understanding of non-human intelligence (e.g. in fungi and plants), and a desire to experiment with music." Jon and Andy have made music together for nearly ten years, but things changed when they started running simple bio-sonification experiments by connecting sensors directly to synthesizers. "The critical leap came when I had the idea for the bionic arms in 2024," Jon says, "enabling plants and fungi to play real-life instruments." This allowed these organisms to become active creators rather than simply passive subjects.
“By giving plants and fungi ‘hands,’ we challenge the exclusive human claim to artistic creation and invite profound reflection on the unique essence of human consciousness versus the intelligence found throughout nature.” At its heart, Bionic and the Wires is meant to “foster a deeper connection with the living world.”

Andy Kidd (left) and Jon Ross (right) playing music with a peace lily plant. All photos courtesy of Bionic and the Wires.

How it works

The music is created through various components: bio-sensors, bionic arms, electronic instruments, and, of course, the plants and fungi. The primary sensor is a bio-sonification device (MIDI Sprout) from Electricity for Progress, which operates on a galvanometer-style circuit. When clipped onto a leaf or the cap of a mushroom, the sensor detects minute fluctuations in the plant or fungus's electrical conductivity, which is affected by the nutrients and water that make up its physiological state. In other words, the sensor detects the plant's “mood” based on its electrical activity. That electrical activity is then translated into MIDI signals, which are fed into bionic arms custom-engineered by Jon. This allows the plants and fungi to “control” the motion.

The final piece is the music technology. “We utilize a combination of traditional and electronic musical instruments as the final output devices for the plants' signals,” Jon says. While the rhythm comes from the plants and fungi, Andy's artistic role lies in deciding how to translate the motion with the synthesizers. For routing, they use Ableton Live, digital audio software.

Aloe with a keyboard.

The intersection of technology and nature

Different plants and mushrooms yield different results, and Jon and Andy have experimented with a wide variety. “Some have a much faster signal response than others,” Jon adds.
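The signal path described above, conductivity fluctuations translated into MIDI notes, can be sketched as a simple threshold mapping. The scale and thresholds below are illustrative assumptions; the MIDI Sprout's actual algorithm differs in detail:

```python
# Minimal bio-sonification sketch: map jumps in a conductivity reading
# onto notes of a pentatonic scale. Thresholds and scale choice are
# illustrative assumptions, not the MIDI Sprout's actual algorithm.

PENTATONIC = [60, 62, 64, 67, 69]  # C major pentatonic, MIDI note numbers

def signal_to_notes(readings, threshold=0.05):
    """Emit a MIDI note whenever the reading jumps by more than `threshold`;
    bigger jumps select higher scale degrees."""
    notes = []
    for prev, cur in zip(readings, readings[1:]):
        delta = abs(cur - prev)
        if delta > threshold:
            degree = min(int(delta * 10), len(PENTATONIC) - 1)
            notes.append(PENTATONIC[degree])
    return notes

# Slow drift produces silence; sudden changes in the organism's
# conductivity produce notes.
samples = [0.50, 0.52, 0.70, 0.71, 0.40, 0.41]
print(signal_to_notes(samples))  # [62, 67]
```

The resulting note stream is what would be routed (here, hypothetically via Ableton Live) to the synthesizers or to the bionic arms striking physical instruments.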
“We choose the plant/fungi based on what type of music we want to make.” Bionic and the Wires shares music on YouTube and other social platforms. “Our art serves as a bridge, making complex scientific concepts about bio-electricity and plant cognition accessible and understandable,” Jon says. It’s the intersection of nature and technology that makes it possible. Their vision for the future of Bionic and the Wires stems from its original idea: “We hope to continue pushing the boundaries of what it means to be an artist and who gets to create,” Jon says, “with a future goal of solidifying the recognition of plants and fungi as creative entities.” For more information: Bionic and the Wires Bionic and the Wires on YouTube Electricity for Progress Ableton Live Read more articles about music >>>
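The signal chain the article describes — conductivity fluctuations read from the organism, translated into MIDI note events that drive the instruments — can be sketched in a few lines. This is a hypothetical illustration only: the function name, scale, and scaling factor are my assumptions, not the project's actual code or the MIDI Sprout's internal algorithm.

```python
# Hypothetical sketch of the bio-sonification idea described above:
# a stream of conductivity readings is mapped to MIDI note numbers,
# so small fluctuations become repeated notes and larger swings step
# through the scale. All names and scaling factors are illustrative.

C_MAJOR_PENTATONIC = [60, 62, 64, 67, 69, 72, 74, 76]  # MIDI note numbers

def readings_to_notes(readings, scale=C_MAJOR_PENTATONIC):
    """Map raw conductivity samples to notes in a musical scale.

    The change between successive samples decides how far to step
    through the scale: a "calm" plant yields repeated notes, while
    an "active" one jumps around.
    """
    notes = []
    index = 0
    for prev, curr in zip(readings, readings[1:]):
        delta = abs(curr - prev)
        index = (index + int(delta * 10)) % len(scale)
        notes.append(scale[index])
    return notes

# Example: a slowly drifting signal produces mostly repeated notes.
samples = [0.50, 0.52, 0.51, 0.58, 0.60, 0.45]
print(readings_to_notes(samples))  # [60, 60, 60, 60, 62]
```

In a setup like the one described, each emitted note number would be sent as a MIDI note-on message to the arms or to a synthesizer, with routing handled in software such as Ableton Live.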
- Film Studios and 3D Printing
3D Printing is a game-changer in the movie and digital EFX industry. Large build volume and reliability at an affordable price for stunning special effects design has become a no-brainer investment. Film Studios and 3D Printing Terry Persun Film and TV Jun 17, 2025 Veteran Makeup FX Artists Steve Yang and Eddie Wang from Alliance Studio—an entertainment design and build studio—discuss how 3D printing with Raise3D has shaped the new era of special effects and sculpture creation. Steve Yang: I moved to L.A. to get into the makeup effects industry. This was a time when The Thing, An American Werewolf in London, and The Howling had already come out. These were huge innovations in makeup effects. I was lucky enough to get in with Stan Winston Studios when I first got here and work for him for a few months. Then I went to Rick Baker’s as a sculptor for Harry and the Hendersons. And I think it wasn’t until shortly after that I met Eddie. He was 17. Here’s this amazing kid, really talented, and he wanted to meet me. We instantly hit it off and have been friends ever since. Eddie Wang: Steve had this completely unique way of doing things. At the time everything was humanoid, what we called a “safe design.” It was all monster paint jobs using purples and flesh tones. Everything was done in a very boring fashion. And then Steve showed up to the monster maker contest with this hermit crab-inspired sea creature amphibian paint job with this samurai kilt underneath, and everybody was like, “Oh my God, it’s beautiful, it’s designed well, it’s something we’ve never seen before.” Steve Yang: In those days, everything we did in the industry started with clay. We’d do maquettes, but our final products were always done in clay.
And when I slowly moved out of the makeup effects arena, I started getting more into creating statues for video games. The first one I did was for Blizzard back in 2004, and it wasn’t until 2010 that they came back to me with this giant robot guy named Jim Raynor, a guy in a robot suit. They showed me the 3D model and said, “We want you to make this.” At that point, it was totally different from what I’d done before. Before, everything was done traditionally, and now I’ve got this giant robot. So that’s the first time that I really got into digital. Digital printing is a huge part of what we do now. A while back digital printing was something relatively new; now it’s everywhere. Every studio has a 3D printer. It just makes so much more sense. They are so much easier to work with. Plus it’s a one-to-one operation. You design stuff on the computer digitally and you get exactly what you designed. The first printer we ever bought was a MakerBot, but it was too small, and we needed something bigger; we needed solutions. So we started researching larger printers, and we looked at every company. I think we were on a tour of Blizzard Studios when Brian Faison said we should look into Raise3D. Sure enough, we contacted John over at Raise3D and he had a printer here ASAP. He rolled it out of the truck, plugged it in, and showed us how to use it. That was pretty much the history. What we got with Raise3D was a bigger build space and higher resolution. We were able to actually make some of the parts we needed for the big life-size statues. We print them, take them to the back, clean them up, and do a regular finishing on them. You can’t tell the difference between that and the parts that we farm out to the big printing companies. You do one job, and it pays for the printer. Oh yeah, the price was a huge thing because the ones we were looking at were two and three times more than a Raise3D printer.
We were expecting to purchase one system, but we were able to afford two of them. Since then, we have recommended Raise3D to so many people. As artists, being creative every day, we think of unique ways of utilizing this technology and the machinery. We can do things that we have never done before. See the conversation video here: https://www.youtube.com/watch?v=BYd7eS4o2ws See the Alliance Studios video here: https://www.youtube.com/watch?v=ZDGIuNLDvXM For more information: Raise3D: https://www.raise3d.com Raise3D Demo Videos: https://www.raise3d.com/demo-video/ Alliance Studios: https://alliancestudios.gg/
- Rigging Control Consoles
J.R. Clancy expands their rigging control console series to address the entire range of performance venues. Rigging Control Consoles EE Staff Stage Events Jun 3, 2025 The family of SceneControl® 5000 controllers now has a console for performance spaces of every size, from the most complex performing arts center, sporting complex, and Broadway show to the high school, church, and college stage. The addition is the SceneControl® 5200 console, which provides programming capability for an unlimited number of cues, including the ability to control other stage machinery like turntables, wagons, chain motors, and even performer flying hoists. Small enough for venues with confined backstage space, the SceneControl 5200 console controls up to 24 axes. Venues using the SceneControl 5200 console will enjoy greater flexibility, expanding their ability to incorporate different kinds of equipment for exciting effects. If the venue’s needs change, the unit can be scaled up as required, because the SceneControl family has one across-the-board, flexible, fully compatible control system architecture. Upgrading or adding additional consoles becomes plug-and-play. The family of consoles includes three controllers for larger venues, offering the flexibility to grow with the changing needs of any venue: The SceneControl® 5600 offers dual 24-inch 1920 x 1080 screens, redundant parallel processing, and a 15.6-inch helm capacitive touch screen on the desktop. Greater video real estate means the operator can see more of what is going on within the system at any given time, with 3D wireframe views that display the performer space and all of its axes. A redundant parallel processor ensures that a hardware failure does not stop the show, and 3D programming makes it possible to fly performers or scenery on X, Y, and Z axes.
The SceneControl® 5500 provides many of the same features, plus a 10.1-inch master helm touch screen and 24-inch monitor screen to allow the operator to use the 3D wireframe view to see all the equipment in motion. This controller also features 3D programming capability for exciting performer flying effects. Redundant processing is included in the event of a hardware failure. Finally, the SceneControl® 5300 is a compact, mid-level operator console that can serve as the main operator interface as well as a localized backstage console. The 15.6-inch touch screen provides an intuitive, easy-to-use interface. As new productions and touring shows reach greater levels of complexity, performing arts centers will be able to integrate the same production values seen in New York or Las Vegas into their in-house rigging control system through the use of the SceneControl® 5000 series. For more information: J.R. Clancy
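The cue-based control the article describes — each cue commanding multiple rigging axes to targets at set speeds — can be sketched as a small data model. This is purely an illustrative assumption about how such a system might be organized; the class and field names are hypothetical and are not J.R. Clancy's API.

```python
# Hypothetical sketch of cue-based multi-axis rigging control: a cue
# records a target position and speed for each axis, and the cue's
# length is set by the slowest-finishing axis. Names are illustrative.

from dataclasses import dataclass, field

@dataclass
class AxisMove:
    axis: int          # axis number (e.g. 1-24 on a 24-axis console)
    target_m: float    # target position in meters
    speed_mps: float   # travel speed in meters per second

@dataclass
class Cue:
    name: str
    moves: list = field(default_factory=list)

    def duration_s(self, current: dict) -> float:
        """Time for the slowest axis to reach its target, given a
        mapping of axis number to current position in meters."""
        return max(
            (abs(m.target_m - current.get(m.axis, 0.0)) / m.speed_mps
             for m in self.moves),
            default=0.0,
        )

# Example: fly a backdrop out while a chain motor trims scenery.
cue = Cue("Act 1 transition", [
    AxisMove(axis=3, target_m=12.0, speed_mps=1.5),
    AxisMove(axis=7, target_m=2.0, speed_mps=0.5),
])
print(cue.duration_s({3: 0.0, 7: 1.0}))  # 8.0 — the backdrop axis is slowest
```

A real console adds far more than this sketch — acceleration profiles, interlocks, redundant processing, and 3D path programming — but the cue-as-data idea is the common core.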
- Printing Sushi in Space? It's Not As "Out There" As You Think
Unique dispenser technology can produce various types of sushi at the press of a button. Printing Sushi in Space? It's Not As "Out There" As You Think Muge Deniz Meiller Cool Stuff Oct 21, 2025 The phrase “micro fluid dispensing” is generally associated with applications like medical device assembly or battery manufacturing. It certainly doesn’t conjure up visions of sushi—at least not yet. If engineers at IHI Aerospace and Yamagata University have their way, though, 3D-printed sushi will be served to space tourists as they circle in low Earth orbit. IHI Aerospace is involved in developing a commercial space platform that could be used to carry civilians into orbit. The company is already looking ahead to enhancing all levels of the experience, including meals—in particular, sushi. IHI had to look beyond specialty chefs, sharp knives, and coolers of fish and seafood, and decided to print the sushi with a lightweight countertop micro dispensing system. Considering that adventurers looking for the thrill of orbital spaceflight will expect an unforgettable experience that includes something more exotic than just a sandwich, IHI Aerospace reached out to Yamagata University, which has a strong aerospace engineering program and an equally well-regarded culinary arts program. After some brainstorming, the University team chose proteins in a paste form rather than as solid fish or seafood. Uni (sea urchin) and other fish pastes are common food items in Japan and many parts of the world. Thus, the concept of sushi made with uni paste is familiar. Pastes have benefits for both quality and logistics. The proteins are harvested and packed at the peak of flavor. Plus, packaged pastes are shelf stable with no leftover food waste to generate odor. Protein pastes are also compatible with non-contact micro fluid dispensing technology, making it possible to automate the sushi preparation.
All photos courtesy of Nordson. The Challenges Developing printable sushi was an innovative concept and presented a number of challenges. The application required a specific volume of uni paste to be dispensed on a bed of rice in a specific pattern and location. Uni paste is a high-viscosity fluid that requires well-controlled pressure to dispense. The nozzle needed to be wide enough to discourage blockages but narrow enough to provide controlled deposition. In addition, the goal of the program was to create a system to produce four different kinds of printed sushi in paste form: uni, white fish, crab, and shrimp. The system needed to be able to toggle from one to another without flavor residue. Further, in the event of a blockage, the nozzles needed to be cleanable. To tackle these challenges and build their prototype, the Yamagata University team turned to Nordson EFD Japan. By integrating Nordson EFD PICO Pµlse piezo jetting valve technology with a compact robot, the group created a precision micro fluid dispensing system capable of printing sushi that rivals products from the local sushi bar. The unit can be installed in a galley and produce various types of sushi with the press of a button. Piezo valves offer high resolution, reliability, and long lifetimes, and they enable the user to tailor stroke length, precisely controlling the amount dispensed. This equips the PICO Pµlse jet valve to optimize deposition and achieve a uniform appearance for sushi pastes with different consistencies. The PICO Pµlse is a modular product, offering great flexibility and enabling it to be configured ideally for each application. A tool-free latch enables the fluid body to be exchanged rapidly and easily. Rapid exchanges are as useful during prototyping as they are once the product is in operation.
The ability to swap out fluid-carrying parts quickly allows the valve to serve its purpose of dispensing different types of pastes and being easy to clean. The IHI Aerospace/Yamagata University team combined the PICO Pµlse valve with the Nordson EFD PICO Touch valve controller and fluid reservoir for an end-to-end solution that combined ease of integration with accurate, reliable operation. The next step was to choose the optimal nozzle to handle the protein pastes. Nordson EFD recommended a flat nozzle with a 300-micron orifice. This nozzle has a wide enough aperture to ensure smooth, controlled deposition of the protein pastes while minimizing the risk of blockages. The nozzle was covered with a special hydrophilic coating used for sticky fluids, which reduced the surface tension of the wetted pathway for improved micro dispensing consistency. While printable sushi for orbital meals is an admittedly exotic use case, printable food in general could have a much broader impact. The Yamagata University team, for example, hopes to continue to explore the technology for food service in facilities like hospitals, nursing homes, and long-term care facilities. For more information: Nordson EFD Nordson PICO Pµlse Valves IHI Aerospace Yamagata University Read more about food technology >>>
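The core control problem described above — depositing a specific volume of paste at specific positions — reduces to simple arithmetic when a jet valve fires discrete droplets: the controller converts a target deposit volume into a pulse count and steps the robot through a pattern. The sketch below is an illustrative assumption; the per-droplet volume, function names, and grid pattern are hypothetical, not Nordson EFD or IHI Aerospace values.

```python
# Hypothetical sketch of jetting a fixed deposit volume: a piezo jet
# valve fires discrete droplets, so a target volume per site becomes
# a pulse count, repeated at each position of a deposit pattern.
# All numbers and names here are illustrative assumptions.

import math

def pulses_for_deposit(target_uL: float, droplet_nL: float) -> int:
    """Number of valve pulses needed to reach a target deposit volume."""
    if droplet_nL <= 0:
        raise ValueError("droplet volume must be positive")
    # 1 uL = 1000 nL; round up so the deposit is never under target.
    return math.ceil(target_uL * 1000.0 / droplet_nL)

def deposit_pattern(rows: int, cols: int, pitch_mm: float):
    """Grid of (x, y) deposit positions for a uniform topping pattern."""
    return [(c * pitch_mm, r * pitch_mm)
            for r in range(rows) for c in range(cols)]

# Example: 5 uL of paste per site with a (hypothetical) 50 nL droplet
# means 100 pulses at each position of a 3 x 3 grid.
print(pulses_for_deposit(5.0, 50.0))    # 100
print(len(deposit_pattern(3, 3, 2.0)))  # 9
```

Rounding the pulse count up rather than to the nearest integer reflects the same bias a food application would want: a slightly generous deposit looks uniform, while an under-filled one shows gaps.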