- Film Studios and 3D Printing
3D printing is a game-changer in the movie and digital EFX industry. Large build volume and reliability at an affordable price for stunning special effects design has become a no-brainer investment.

Terry Persun | Film and TV | Jun 17, 2025

Veteran makeup FX artists Steve Yang and Eddie Wang from Alliance Studios—an entertainment design and build studio—discuss how 3D printing with Raise3D has shaped the new era of special effects and sculpture creation.

Steve Yang: I moved to L.A. to get into the makeup effects industry. This was a time when The Thing had already come out, along with An American Werewolf in London and The Howling. These were huge innovations in makeup effects. I was lucky enough to get in with Stan Winston Studios when I first got here and work for him for a few months. Then I went to Rick Baker's as a sculptor for Harry and the Hendersons. It wasn't until shortly after that that I met Eddie. He was 17. Here's this amazing kid, really talented, and he wanted to meet me. We instantly hit it off and have been friends ever since.

Eddie Wang: Steve had this completely unique way of doing things. At the time everything was humanoid, what we called a "safe design." It was all monster paint jobs using purples and flesh tones. Everything was done in a very boring fashion. And then Steve showed up to the monster maker contest with this hermit crab-inspired, sea creature amphibian paint job with a samurai kilt underneath, and everybody was like, "Oh my God, it's beautiful, it's designed well, it's something we've never seen before."

Steve Yang: In those days, everything we did in the industry started with clay. We'd do maquettes, but our final products were always done in clay. When I slowly moved out of the makeup effects arena, I started getting more into creating statues for video games. The first one I did was for Blizzard back in 2004, and it wasn't until 2010 that they came back to me with this giant robot guy named Jim Raynor, a guy in a robot suit. They showed me the 3D model and said, "We want you to make this." At that point, it was totally different from anything I'd done before. Before, everything was done traditionally, and now I've got this giant robot. That's the first time I really got into digital.

Digital printing is a huge part of what we do now. A while back it was something relatively new; now it's everywhere. Every studio has a 3D printer. It just makes so much more sense. They are so much easier to work with. Plus it's a one-to-one operation: you design stuff on the computer digitally and you get exactly what you designed. The first printer we ever bought was a MakerBot, but it was too small, and we needed something bigger; we needed solutions. So we started researching larger printers, and we looked at every company. I think we were on a tour of Blizzard Studios when Brian Faison said we should look into Raise3D. Sure enough, we contacted John over at Raise3D and he had a printer here ASAP. He rolled it out of the truck, plugged it in, and showed us how to use it. That was pretty much the history. What we got with Raise3D was a bigger build space and higher resolution. We were able to actually make some of the parts we needed for the big life-size statues.
We print them, take them to the back, clean them up, and do a regular finishing on them. You can't tell the difference between those and the parts that we actually farm out to the big printing companies. You do one job, and it pays for the printer. Oh yeah, the price was a huge thing, because the ones we were looking at were two and three times more than a Raise3D printer. We were expecting to purchase one system, but we were able to afford two of them. Since then, we have recommended Raise3D to so many people. As artists, being creative every day, we think of unique ways of utilizing this technology and the machinery. We can do things that we have never done before.

See the conversation video here: https://www.youtube.com/watch?v=BYd7eS4o2ws
See the Alliance Studios video here: https://www.youtube.com/watch?v=ZDGIuNLDvXM

For more information:
Raise3D: https://www.raise3d.com
Raise3D Demo Videos: https://www.raise3d.com/demo-video/
Alliance Studios: https://alliancestudios.gg/
- Creativity on a Monstrous Scale
Entertainment Engineering talks with one of the best-known and most highly respected special effects companies in the film business to discuss creativity, freedom, and herding cats. Dave Merritt is Model Shop Supervisor at Legacy Effects, and he took time out of a very busy day to speak with us. Here is part of that conversation.

Derek Wells | Film and TV | Jun 4, 2025

EE: First of all, how many men and women do you typically employ, and what are their specialties?

Dave Merritt: Legacy Effects operates from a core group of about 45 people, and we can quickly ramp up to 150 people as more projects come in. We staff all types of specialists, but we separate them by departments; some departments include more than one specialist. The departments are broken up into Art, 3D Modeling, Mechanical, Fabrication (which includes hair and fur), Electronics, Molding and Casting, and the Model Shop.

EE: Creativity is an important aspect of Legacy's production. How often do you start with one design and end up with something completely different based on inputs from different departments?

Dave: We strive to create exactly what our clients desire, but there are times when staff collaboration may change our approach to the final project. Typically our timeframes are short, so a great deal of collaboration needs to happen during the project. For example, we may use materials and processes from one department to replace a more time-consuming method in another department.

EE: What percentage of your projects have some sort of motion involved? And do your designers typically use electric, pneumatic, or hydraulic components for the motion?

Dave: About 50 percent of our work involves some sort of motion. That motion can incorporate everything from simple rod puppets to large hydraulic systems, depending on the individual project. In Real Steel, the robot Atom was built with rod puppet arms and a hydraulic head, which allowed for a smoother and more fluid operation.

EE: For any single project, how many different designers are involved?

Dave: Each department has input on the design of a project. For instance, five key designers from various departments were involved with the Iron Man suit. Our Fabrication Dept. dealt with how the materials would work together as a whole; Mechanical worked out hinge points and fasteners, while the Model Shop and Mold Dept. focused on the patterns and casting process. Then Electronics followed up with the lighting and wiring harness. Now, for something like the aliens in Cowboys and Aliens, the same process applied, but the puppet was more organic, so the Digital and Real World Sculptors, Mechanics, and Mold Dept. were more predominant in the build.

EE: How do you use the computers you have in-house, since you don't provide CGI to your clients?

Dave: We utilize 3D modeling in order to visualize what we are going to build and to identify specific elements that may go through the rapid prototyping process. We use our in-house system to produce maquettes for clients as well as small detail components.

EE: How many projects do you work on at any one time?

Dave: We typically run four to six television commercials and two to four feature films at one time.
EE: If you had only a few words to explain how it is to work with such a diverse and talented group, what would you say?

Dave: It's a very rewarding experience to be able to work in a creative environment with such talented people.

EE: Thank you for taking the time to answer our questions.

Dave: Thank you.

For more information: Legacy Effects
- Wicked Technology on Stage at Comic-Con
Blending creativity, innovation, and technology helped make this Comic-Con event light up.

Edited by EE Staff | Stage Events | Nov 25, 2025

Held at the Manchester Grand Hyatt's Seaport Ballroom, audience members at the Her Universe Fashion Show experienced a Wicked-themed immersive lobby featuring pop-ups such as a KISS nail booth, photo ops, film props and costumes, and LEGO big builds. From elaborate couture fashion and musical performances to a technology-enhanced magical runway entrance by Eckstein herself, the event was designed to captivate and inspire.

Her Universe images courtesy of Mark Edwards Photography.

The runway serves as the stage for a group of selected designers who created and showcased their designs—some of which were inspired by the Wicked fandom—during the 2025 Her Universe Fashion Show. At the end of the night, Lynleigh Sato and Caitlin Beards were chosen by both the audience and an expert panel of judges as the winning designers. They were awarded a cash prize of $2,000 USD each and have been offered the opportunity to design a fashion collection with Her Universe.

Central to this vision was the transformation of the stage into the World of Oz, a feat made possible by ALTRD Projections and Barco's state-of-the-art projectors.

Delivering Impactful Visuals

This year's production presented a unique spatial and creative challenge: how to deliver high-impact visuals across four separate projection surfaces. The team needed a solution that delivered consistent brightness, sharp resolution, and vibrant color across each of the four side screens—two on each side of the stage—while maintaining visual cohesion with the central LED display. For the immersive experience of the show, it was critical to attain seamless integration and high performance across all of these distinct surfaces.

Four high-performance Barco UDX-4K40 projectors, known for their brightness, 4K resolution, and color fidelity, were instrumental in creating a cohesive, large-scale projection canvas from the four side screens that framed the runway. The UDX's ability to deliver consistent, vibrant imagery across wide surfaces allowed the creative team at ALTRD Projections to design and execute a fully immersive runway backdrop. The visuals were crisp, dynamic, and emotionally resonant, bringing the magical world of Oz to life.

Product image courtesy of Barco.

The projectors handled complex motion graphics and color-rich content with ease, ensuring that every detail was rendered with cinematic precision. Barco's technology didn't just support the show's visuals; it elevated them, enabling a level of storytelling and spectacle that matched the ambition of the designers and performers. The result was a flawless fusion of fashion and technology.

Her Universe images courtesy of Mark Edwards Photography.

The visuals created by the Barco projectors were meant to provide a wickedly memorable evening for attendees, elevating the entire production and immersing the audience in a fantastical experience that matched the creativity of the designs on the runway. The show drew record attendance and rave reviews, with fans and judges alike praising the visual storytelling and technical excellence. By blending creativity and innovation, Barco was a committed technology partner enabling Her Universe to push the boundaries of fashion and fandom once again.
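For readers curious about the engineering behind "consistent brightness," the sketch below shows a standard back-of-the-envelope estimate of average screen luminance from projector output and screen area. The screen dimensions and gain are illustrative assumptions, and the 40,000-lumen figure reflects the UDX-4K40's nominal class of output rather than a measured on-site value.

```python
# Rough screen-brightness estimate for a projection surface: average luminance
# (cd/m^2, i.e. "nits") ~= (projector lumens * screen gain) / (pi * screen area).
# Screen size and gain are assumptions for illustration.
import math

def screen_luminance_nits(lumens, width_m, height_m, gain=1.0):
    """Average luminance of a diffuse screen lit by a projector."""
    return lumens * gain / (math.pi * width_m * height_m)

# Hypothetical 6 m x 3.4 m side screen lit by one ~40,000-lumen projector:
print(round(screen_luminance_nits(40_000, 6.0, 3.4)))   # ~624 nits

# The same output spread over a much larger blended surface drops quickly,
# which is why screen area drives how many projectors a design needs:
print(round(screen_luminance_nits(40_000, 12.0, 6.8)))  # ~156 nits
```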
The Her Universe Fashion Show, now in its 11th year, has become a cornerstone of San Diego Comic-Con, celebrating fandom through fashion. Presented by Universal and hosted by Her Universe founder Ashley Eckstein, the 2025 edition was themed "Defying Fashion: Fashion That Defies Expectations," inspired by the popular Wicked franchise. This year's show reached new heights in creativity, inclusivity, and visual spectacle.

For more information: Barco UDX-4K40 FLEX | Her Universe

See other theatre and stage case studies >>>
- Custom Aerial Rigging for Stage and Theatre
Multiple rigging systems and bungy-assisted motion bring 'Puppet Master - Into Thin Air' to life through immersive, multi-axis flight choreography.

Gavin "Wild" Smith, Founder of Aero Motion Australia | Stage Events, Film and TV | Aug 5, 2025

Aero Motion Australia designs and produces custom rigging systems for aerial performance across stage, theatre, circus, and screen. With deep roots in aviation and an unrelenting pursuit of visual wonder and technical mastery, the company continues to redefine what's possible in the air, creating elegant, mechanical solutions for complex motion.

For the theatrical performance Puppet Master - Into Thin Air, Aero Motion developed a uniquely manual rigging system capable of supporting dynamic, three-dimensional flight across multiple axes without relying on motors or automation. The system combined an overhead catenary, or dual running slacklines, with diversion pulleys, compound rigging pulleys with a 4:1 advantage to assist in lifting the aerial artist, and bungies, creating an expressive movement language suited to the show's surreal themes.

At the core of the design was an overhead catenary dual-rope running span line system, or slackline, engineered with mechanical advantage—a rope-and-pulley method that amplifies the operator's lifting force, allowing the artist to be flown by hand. The performer was suspended from a rolling point mounted mid-span, enabling single-plane, multi-axis flight. This setup produced a pendulum-style motion across the stage, with the performer rising from an offset floor position, interacting with a ground-based artist, and 'floating' through space.

All photos courtesy of Aero Motion Australia.

To push beyond a linear flight path, a secondary tangential rigging system was introduced. This added an orbital layer of motion by tethering to the rolling point, allowing the performer to traverse complex arcs. Coordinating both systems demanded not just technical precision but also a skilled human counterweight operator—something Aero Motion believes can be trained within circus and stunt rigging disciplines.

To assist with vertical movement along the Z-axis, custom-built bungy cords connected a corset-style harness to a circular spreader bar, providing elastic lift and responsive recoil that enhanced the rigging's mechanical rhythm. These bungy cords are hand-crafted using raw materials similar to those found in AJ Hackett-style systems, tailored specifically for aerial performance applications. The construction method is intricate, carefully calibrated to achieve the desired elasticity, rebound characteristics, and overall dynamic response required for the choreography.

While highly effective in delivering a fluid and energetic performance, the bungy rubber is inherently prone to wear. As the cords stretch and recoil, the energy exerted under tension generates internal heat, which gradually degrades the rubber from the inside out. Because of this, each bungy rope has a limited operational lifespan and must be closely inspected and monitored for signs of fatigue or damage before every use.
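The hand-flying figures involved are easy to sanity-check. The short sketch below estimates the pull force an operator needs on a 4:1 mechanical-advantage system like the one described above; the performer mass, friction allowance, and bungy assist values are illustrative assumptions, not Aero Motion's production numbers.

```python
# Rough sanity check of the hand force needed to fly a performer on a
# manually operated pulley system with mechanical advantage (no motors).
# All input values are illustrative assumptions, not production figures.

G = 9.81  # m/s^2

def operator_pull_force(performer_kg, advantage=4, pulley_efficiency=0.85,
                        bungy_assist_n=0.0):
    """Estimate operator hand force in newtons.

    advantage         -- mechanical advantage of the pulley system (e.g. 4 for 4:1)
    pulley_efficiency -- lumped friction losses across the sheaves (assumed)
    bungy_assist_n    -- static lift supplied by elastic bungies (assumed)
    """
    load = performer_kg * G - bungy_assist_n          # net weight to support
    return max(load, 0.0) / (advantage * pulley_efficiency)

if __name__ == "__main__":
    # 65 kg performer, 4:1 advantage, ~15% friction loss, no bungy assist:
    print(round(operator_pull_force(65), 1), "N")      # ~187 N (~19 kgf of pull)
    # Same performer with ~200 N of elastic assist from the bungies:
    print(round(operator_pull_force(65, bungy_assist_n=200.0), 1), "N")  # ~129 N
    # Note: the operator must also haul rope 4x the lift distance -- the
    # usual trade-off of any mechanical-advantage system.
```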
Swivels enabled the spreader bar to spin freely. As the performer's rotational speed increased, the system exhibited a striking physical effect: gyroscopic precession. This phenomenon is the change in orientation of the rotational axis of a spinning object when an external force is applied. Rather than moving in the direction of the force, the spinning object responds 90° later in the direction of its rotation, introducing new rigging geometry and layout design challenges. Aero Motion addressed this by offsetting the geometry of the rig's secondary tangential rigging system, aligning it to contain the full X-Y-Z motion envelope within a 15-meter (50-foot) high studio space.

"The performer's movement became both aerial and orbital—suspended in a mechanical ballet that was entirely human-powered," says Gavin "Wild" Smith, Aero Motion's founder.

With the full spectrum of the rigging universe engaged, Puppet Master - Into Thin Air became a rare example of immersive, live performance engineering, where rigging, choreography, and physics converged.

Technical Details for Engineers & Riggers:
- Primary System: Overhead running span line, or slackline span, with a 2:1 to 4:1 mechanical advantage pulley system; no automation, operated manually. The rolling point is a dual pulley on both overhead running catenary lines that rolls along the span. Kernmantle static rope, 11 mm diameter, is used throughout the system.
- Flight Domain: Single-plane pendulum flight, with a radial rigging system manipulating the main system.
- Secondary System: Tangential tether to the rolling point enables X-Y movement in an arc; requires human counterweight operation for control.
- Vertical Control: Custom elastic bungies in parallel, fitted to a bespoke circular spreader. Rotation produces gyroscopic precession.
- Considerations: Operator training is essential. Flight path management requires anchor point offsetting and a clear stage-to-roof clearance of ~15 m. No load cell integration; the system relies on operator feel and controlled descent/lift.

For more information: Aero Motion Australia | Watch Puppet Master - Into Thin Air

Author Bio: Gavin "Wild" Smith is the founder of Aero Motion Australia, a specialist in custom aerial rigging for circus, stunt, theatre, and aviation environments. With a background in engineering and mechanical systems as well as creative performance, his work bridges the gap between engineering precision and artistic expression. Gavin's rigging designs have featured in live events, feature films, and high-risk helicopter operations around the world.
- How a Deep Sea Camera Servo Drive Endures Extreme Pressures
Designing the right servo drive for camera motion in deep sea exploration meant the drive had to withstand extreme pressures of up to 8,800 psi.

Edited by EE Staff | Cool Stuff | Dec 2, 2025

At 6,000 meters below the surface of the sea, pressures can reach 8,800 psi. The Titanic, by comparison, lies only about 3,800 meters below the surface. Equipment used at such depths, including camera motion devices, often has to be encased inside a thick, sealed container. The customer approached ADVANCED Motion Controls (AMC) about using its servo drives without a sealed container, suggesting an alternative approach: submerging the electronics in a non-conducting oil bath. After speaking with AMC's applications engineers, the customer was encouraged to "give it a try," even though AMC had never tested its devices at the required pressures.

The customer purchased a standard DigiFlex servo drive and performed high-pressure testing of the device while it was submerged in the oil bath. The drive held up remarkably well, except that the standard electrolytic capacitors (shaped like small cans) were being crushed by the pressure. The customer's solution during the testing process was to drill a small hole in the capacitor housing and allow the oil to equalize the pressure. This worked perfectly for the prototype, but as a manual modification it would be costly and inefficient in a production setting.

Images courtesy of ADVANCED Motion Controls.

After some design consideration, the AMC applications engineering team provided a custom solution using high-pressure-tolerant, solid-state capacitors, which handle the pressure without any additional modification. For feedback, the customer proposed incorporating a magnetic encoder chip, which required a magnet on the rotating shaft and a sensor chip positioned precisely above it. The AMC team chose to design a custom daughterboard that the drive could plug into. The daughterboard held the sensor chip in perfect alignment with the motor shaft magnet, solving the mechanical difficulty of alignment, saving the customer from having to build its own brackets, and reducing the mounting footprint. The customer required that the main communications link between the drive and the host use standard RS-232.

Images courtesy of ADVANCED Motion Controls.

By overcoming multiple challenges and committing to making the design work, AMC was able to work with its customer to produce the right solution for a difficult project. The underwater camera became a successful and integral product for undersea exploration. Cross-industry applications for AMC technologies are explained in a companion video.

For more information: ADVANCED Motion Controls | DigiFlex Drive | Servo Drive Selector | Other underwater applications >>>
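As a quick check of the pressure figures quoted in this case study: hydrostatic pressure rises by roughly one atmosphere for every 10 meters of seawater, and the sketch below reproduces the 8,800 psi number using a typical (assumed) seawater density.

```python
# Back-of-the-envelope hydrostatic pressure at depth: P = rho * g * h
# (plus ~1 atm of surface pressure). Density is an assumed typical value.

RHO_SEAWATER = 1025.0   # kg/m^3, typical open-ocean value (assumed)
G = 9.81                # m/s^2
ATM_PA = 101_325.0      # surface atmospheric pressure, Pa
PA_PER_PSI = 6894.76

def pressure_psi(depth_m: float) -> float:
    """Absolute pressure in psi at a given seawater depth."""
    pascals = RHO_SEAWATER * G * depth_m + ATM_PA
    return pascals / PA_PER_PSI

print(round(pressure_psi(6000)))  # ~8765 psi -- in line with the ~8,800 psi cited
print(round(pressure_psi(3800)))  # ~5557 psi at roughly the Titanic's depth
```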
- Electronic Waste is On the Rise — This Metal Recovery Technology is a Golden Solution
An automation system integral to a successful precious metals recovery technology is now set for global rollout.

Cool Stuff | Aug 29, 2025

Electronic waste is one of the world's fastest-growing waste streams, according to Statista market research. With e-waste generation forecast to exceed 80 million metric tons by 2030, improved recycling and recovery infrastructure is critical. One company on a mission to help offset this trend is The Royal Mint, the UK's oldest company. With a contemporary sustainability ethic at the heart of a long-term strategy, the company has invested in, enhanced, and scaled a hugely promising technology used for the recovery of precious metals and other materials from e-waste.

At the start of the project, the extraction technology, developed by the Canadian company Excir, existed only in prototype form. It required further development and scaling to hit The Royal Mint's target of 4,000 metric tons per annum. Additional technological and processing hurdles included tightly controlled scheduling and investment.

All photos courtesy of Rockwell Automation and The Royal Mint.

According to Rockwell Automation UK managing director Phil Hadfield, "This was never going to be a straightforward project. We knew there were multiple challenges, some of which we could not influence, but we also knew that our highly experienced process technical team and our family of tightly integrated technologies would bring the extraction process under tighter control. Once that was established, further evolution and eventual scaling would be a little more straightforward."

In The Royal Mint's Precious Metal Recovery facility, circuit boards are fed via a conveyor system into a reactor, and the resulting sludge then undergoes separation, sorting, and filtering using specialized chemical processes to extract gold and other precious metals from the mixed materials in a tightly controlled precipitation process. The unique chemistry extracts 99% of the gold in e-waste and, more importantly, the process takes place at room temperature rather than requiring high-temperature smelting. The system separates out more than just precious metals. Every part of a printed circuit board can be accounted for; even the fiberglass by-product is "de-brominated," with hazardous bromine compounds removed, as an integral part of the Mint's circular economy and related net-zero plans.

All photos courtesy of Rockwell Automation and The Royal Mint.

Rockwell Automation's Lifecycle Services team delivered a complete process control solution based on the company's PlantPAx® DCS system, which included operator workstations as well as engineering workstations. The overall system allows integration from plant-floor instrumentation up to the boardroom, with contextualized reporting providing insights to drive actions for optimizing production. PlantPAx proved ideal for this project due to its ability to scale in line with the growth and evolution of the process. It also connects disparate pieces of equipment in the complex plant and controls them in one place, using an interface familiar to the Mint's engineers, who have worked with Rockwell Automation before.
Hadfield explains, "The system architecture enables different vendors to manufacture different parts of the plant and then easily plug it all together, giving one plant-wide infrastructure control when it's up and running. This approach eliminates the disparate control systems that you often have in projects like this and provides optimization improvements, such as common log-ons, change management, alarm management, data logging, and more."

All photos courtesy of Rockwell Automation and The Royal Mint.

Even with the initial uncertainty and multiple process variables, engineers from all parties were delighted when the system delivered gold on the first run, at scale, and without insights from an intermediate pilot stage. According to Tony Baker, Director of Manufacturing Innovation at The Royal Mint: "After so many early challenges, we are pleased that it is all going to plan. In fact, we have already surpassed our 2024/25 target of 400 metric tons, when we hit 500 metric tons earlier this year."

For more information: Rockwell Automation | PlantPAx® | The Royal Mint Precious Metal Recovery Factory | Excir
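To put the throughput figures quoted in this case study into perspective, the rough sketch below converts an annual feedstock tonnage into an indicative gold yield. The gold concentrations are assumed values of the kind commonly quoted for discarded circuit boards; they are not figures published by The Royal Mint or Excir.

```python
# Indicative yield from e-waste throughput. Concentration values are
# assumptions for illustration, not figures published by The Royal Mint.

def gold_recovered_kg(feed_tonnes, gold_g_per_tonne, recovery=0.99):
    """Gold recovered (kg) from a given mass of circuit-board feedstock."""
    return feed_tonnes * gold_g_per_tonne * recovery / 1000.0

for concentration in (100, 200, 300):                 # g of gold per tonne of boards (assumed)
    annual = gold_recovered_kg(4000, concentration)   # 4,000 t/yr target from the article
    print(f"{concentration} g/t -> ~{annual:.0f} kg of gold per year")
# e.g. an assumed 200 g/t works out to ~792 kg/yr at the full 4,000 t per annum target
```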
- Capturing Ball Speed and Spin Rates with Radar for Live Broadcasts
Baseball speed and spin rates will be integrated into an app and live broadcasts and used for recruiting.

Joe Gillard | Sports | Jul 7, 2025

Stalker Sport, a manufacturer of sports radar technology, has partnered with AWRE Sports, which specializes in multi-angle video, data analytics, and live streaming products for baseball and softball. The partnership will integrate Stalker's speed and spin rate data into AWRE's app and live broadcasts.

Also from EE: Electric Race Car Uses 3D-Printed Components

While speed is a well-known pitch measurement, spin rate is a measure of the amount of spin, in revolutions per minute, of a baseball or softball, which can affect the trajectory of a given pitch. Stalker Sport's Pro S line is a handheld sports radar capable of capturing this measurement in addition to speed. With the AWRE partnership, baseball teams using the Pro 3s, Pro 3, or Sport 3 Connect can connect their radar with AWRE's system, allowing pitch data to be displayed during broadcasts and logged in AWRE's charting system. AWRE has developed a number of sports apps for athletes, schools, sports facilities, and other users.

"At Stalker Sport, we're always looking for meaningful ways to enhance the value of our technology for players, coaches, and scouts," said Greyson Jenista, Product Manager for Sports Tech at Stalker Sport. "Partnering with AWRE Sports allows us to bring our trusted data directly into dynamic video and live streaming environments, making performance insights more accessible and impactful than ever before."

How do sports radar guns work?

Radar guns for measuring pitch speed rely on Doppler radar. Microwaves are directed at the ball (or any moving object) and the change in frequency is measured after the signal bounces off it. As explained by OSU professor Todd Thompson, a radar gun, such as a police radar, "bounces a pulse of microwaves (or infrared laser light) of a known wavelength off a car or truck," and then measures the wavelength that is reflected back toward the gun. "The Doppler shift gives the vehicle's speed."

Measuring spin rate is more difficult, and Stalker boasts having the only sports radar gun that can do it. According to the company, "measuring ball rotation requires longer tracking of the ball's flight, followed by a complex calculation determining ball spin."

A tool for recruiting

The data will also include automated highlight reels that can be shared with college recruiters and coaches, the company says.

"We are constantly seeking opportunities to integrate verified data our clients desire with video. Integrating the Stalker Gun, already a very valuable tool, directly into the AWRE Charting app was a no-brainer," said Chris Clark, CEO of AWRE. "Stalker plus AWRE now allows coaches and scouts to automatically gather, tag, and organize video and verified data. This integration presents a significant advantage for coaches, scouts, players, and prospects alike."

"This integration is just the beginning," continued Jenista. "We're excited about the opportunities ahead to innovate alongside AWRE and continue pushing the boundaries of what's possible in athlete development, recruiting, and fan engagement."

Source: Stalker Sport
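The Doppler principle described above is easy to work through numerically. The sketch below estimates the frequency shift a radar gun sees for typical pitch speeds; the 34.7 GHz carrier is an assumed Ka-band operating frequency for illustration, not a published Stalker specification.

```python
# Doppler shift seen by a sports radar: f_shift = 2 * v * f0 / c
# (factor of 2 because the wave travels to the ball and back).
# The carrier frequency is an assumed Ka-band value, not a Stalker spec.

C = 299_792_458.0        # speed of light, m/s
F0 = 34.7e9              # assumed Ka-band carrier, Hz
MPH_TO_MS = 0.44704

def doppler_shift_hz(speed_mph: float) -> float:
    v = speed_mph * MPH_TO_MS
    return 2.0 * v * F0 / C

for mph in (75, 85, 95, 100):
    print(f"{mph} mph pitch -> ~{doppler_shift_hz(mph)/1000:.1f} kHz shift")
# A ~95 mph fastball shifts the return by roughly 9.8 kHz, which is easy to
# resolve -- that is why speed is straightforward, while spin requires tracking
# modulation of the return over the ball's flight, a harder signal-processing problem.
```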
- Spatial Computing Allows Engineers and Designers to Bring 3D Designs to Life
Leveraging the power of spatial computing with manufacturing's digital twin technology leads to more capabilities for engineering collaboration, from product design to manufacturing.

Terry Persun | Cool Stuff | Nov 3, 2025

In a recent release, Dassault Systèmes announced that 3D UNIV+RSES, which is powered by the 3DEXPERIENCE platform, will use spatial computing capabilities to provide a new dimension to virtual twins through the "3DLive" visionOS app. According to the release, Dassault Systèmes partnered with Apple to integrate Apple Vision Pro into the next-generation 3DEXPERIENCE platform. This deep engineering-level collaboration between Dassault Systèmes and Apple has brought together the best of both platforms to deliver what Dassault Systèmes considers a magical experience.

With 3DLive, virtual twins created on the 3DEXPERIENCE platform will appear to leap off the screen and into a user's physical space, enabling real-time visualization and team collaboration in lifelike environments. Apple Vision Pro incorporates advanced cameras, sensors, and tracking to allow virtual twins to interact with the physical world around them in 3D UNIV+RSES with scientific accuracy. According to the release, this unique and powerful way to model, simulate, manufacture, train, and operate delivers value across all industry sectors and roles, enabling customers to harness the full potential of 3D UNIV+RSES and spatial computing to adapt quickly to market demand, ensure scientifically accurate product quality, accelerate workforce training, and collaborate and share knowledge and know-how.

Elisa Prisner, Executive Vice President – Corporate Strategy & Platform Transformation, Dassault Systèmes, is quoted as saying, "Our engineering collaboration with Apple represents a bold advance that reveals the power of 3D UNIV+RSES, where 3D is a universal language for a new world combining real and virtual. The wide and growing adoption of the 3DEXPERIENCE platform by our clients makes this cooperation a unique value for our highly diversified customer base, seeing the high potential of 3D UNIV+RSES to collaborate and train our next generation AI-based experiences on their own virtual twin data set."

Mike Rockwell, Apple's vice president of the Vision Products Group, said, "Apple Vision Pro continues to push the boundaries of what's possible with spatial computing and is changing the way people work across key industries. We're thrilled to be collaborating with Dassault Systèmes to supercharge the 3DEXPERIENCE platform with spatial computing capabilities that will enable engineers and designers to easily bring 3D designs to life in ways not previously possible."

Enterprise customers can download Dassault Systèmes' new 3DLive app for Apple Vision Pro. In addition, the release said that Dassault Systèmes has released a new Apple Vision Pro app, HomeByMe Reality, that allows users to imagine, explore, and visualize home interior options from the comfort of their own home, a furniture store, or a showroom.

All images courtesy of Dassault Systèmes.

For more information: Dassault Systèmes | 3DLive | HomeByMe Reality | Apple Vision Pro

Read more about virtual reality >>>
- Printing Sushi in Space? It's Not As "Out There" As You Think
Unique dispenser technology can produce various types of sushi at the press of a button.

Muge Deniz Meiller | Cool Stuff | Oct 21, 2025

The phrase "micro fluid dispensing" is generally associated with applications like medical device assembly or battery manufacturing. It certainly doesn't conjure up visions of sushi—at least not yet. If engineers at IHI Aerospace and Yamagata University have their way, though, 3D-printed sushi will be served to space tourists as they circle in low Earth orbit.

IHI Aerospace is involved in developing a commercial space platform that could be used to carry civilians into orbit. The company is already looking ahead to enhancing all levels of the experience, including meals—in particular, sushi. IHI had to look beyond specialty chefs, sharp knives, and coolers of fish and seafood, and decided to print the sushi with a lightweight countertop micro dispensing system.

Considering that adventurers looking for the thrill of orbital spaceflight will expect an unforgettable experience that includes something more exotic than a sandwich, IHI Aerospace reached out to Yamagata University, which has a strong aerospace engineering program and an equally well-regarded culinary arts program. After some brainstorming, the university team chose proteins in paste form rather than solid fish or seafood. Uni (sea urchin) and other fish pastes are common food items in Japan and many parts of the world, so the concept of sushi made with uni paste is familiar.

Pastes have benefits for both quality and logistics. The proteins are harvested and packed at the peak of flavor. Plus, packaged pastes are shelf stable, with no leftover food waste to generate odor. Protein pastes are also compatible with non-contact micro fluid dispensing technology, making it possible to automate the sushi preparation.

All photos courtesy of Nordson.

The Challenges

Developing printable sushi was an innovative concept and presented a number of challenges. The application required a specific volume of uni paste to be dispensed on a bed of rice in a specific pattern and location. Uni paste is a high-viscosity fluid that requires well-controlled pressure to dispense. The nozzle needed to be wide enough to discourage blockages but narrow enough to provide controlled deposition. In addition, the goal of the program was to create a system that could produce four different kinds of printed sushi in paste form: uni, white fish, crab, and shrimp. The system needed to be able to toggle from one to another without flavor residue. Further, in the event of a blockage, the nozzles needed to be cleanable.

To tackle these challenges and build their prototype, the Yamagata University team turned to Nordson EFD Japan. By integrating Nordson EFD PICO Pµlse piezo jetting valve technology with a compact robot, the group created a precision micro fluid dispensing system capable of printing sushi that rivals products from the local sushi bar. The unit can be installed in a galley and produce various types of sushi at the press of a button.

All photos courtesy of Nordson.

Piezo valves offer very high resolution and reliability, with long lifetimes. These characteristics enable the user to tailor stroke length, precisely controlling the amount dispensed.
This characteristic equips the PICO Pµlse jet valve to optimize deposition, achieving a uniform appearance for sushi pastes with different consistencies. The PICO Pµlse is a modular product, offering great flexibility and enabling it to be configured ideally for each application. A tool-free latch enables the fluid body to be exchanged rapidly and easily. Rapid exchanges are as useful during prototyping as they are once the product is in operation. The ability to swap out fluid-carrying parts quickly allows the valve to serve its purpose of dispensing different types of pastes while remaining easy to clean.

The IHI Aerospace/Yamagata University team combined the PICO Pµlse valve with the Nordson EFD PICO Touch valve controller and a fluid reservoir for an end-to-end solution that combined ease of integration with accurate, reliable operation.

The next step was to choose the optimal nozzle to handle the protein pastes. Nordson EFD recommended a flat nozzle with a 300-micron orifice. This nozzle has a wide enough aperture to ensure smooth, controlled deposition of the protein pastes while minimizing the risk of blockages. The nozzle was covered with a special hydrophilic coating used for sticky fluids, which reduced the surface tension of the wetted pathway for improved micro dispensing consistency.

While printable sushi for orbital meals is an admittedly exotic use case, printable food in general could have a much broader impact. The Yamagata University team, for example, hopes to continue exploring the technology for food service in facilities like hospitals, nursing homes, and long-term care facilities.

For more information: Nordson EFD | Nordson PICO Pµlse Valves | IHI Aerospace | Yamagata University

Read more about food technology >>>
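A rough way to see why the nozzle trade-off above matters: for pressure-driven flow of a viscous fluid, throughput scales with the fourth power of the orifice radius. The sketch below uses the Hagen-Poiseuille relation as a crude Newtonian approximation (real food pastes are shear-thinning), and the pressure, viscosity, and nozzle length are illustrative assumptions rather than Nordson or Yamagata University figures.

```python
# Why orifice diameter matters so much: for pressure-driven flow of a viscous
# fluid, Hagen-Poiseuille gives Q = pi * dP * r^4 / (8 * mu * L), so flow
# scales with the 4th power of orifice radius. Values below are illustrative;
# real pastes are non-Newtonian, so treat this as a rough approximation only.
import math

def poiseuille_flow_ul_per_s(diameter_um, dp_kpa=300.0, mu_pa_s=30.0, length_mm=2.0):
    """Volumetric flow (microliters/second) through a cylindrical orifice."""
    r = diameter_um * 1e-6 / 2.0          # radius, m
    dp = dp_kpa * 1e3                     # pressure drop, Pa
    length = length_mm * 1e-3             # nozzle land length, m
    q_m3_s = math.pi * dp * r**4 / (8.0 * mu_pa_s * length)
    return q_m3_s * 1e9                   # m^3/s -> microliters/s

for d in (200, 300, 400):                 # orifice diameters in microns
    print(f"{d} um orifice -> ~{poiseuille_flow_ul_per_s(d):.2f} uL/s")
# Going from 200 um to 300 um multiplies flow by (300/200)^4 ~ 5x at the same
# pressure -- which is why orifice choice and tight pressure control together
# determine how repeatable each deposit is.
```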
- Virtual Reality Ride Experience
An augmented virtual reality ride helps customers be 'in the moment.'

Terry Persun | Theme Parks | May 21, 2025

When looking for automation challenges to solve, you don't always have to look for industrial applications. AllMotion distributor Heitek Automation, located in Arizona, certainly thought outside the box when it helped develop and build a unique virtual reality (VR) experience ride for Dave & Buster's, the well-known restaurant and entertainment business.

Also from EE: How Jurassic World Rebirth Captured the Nostalgia of Film

The VR experience was made with design cues from the Jurassic World trilogy and consists of a roller coaster-style carriage that guests sit in while wearing virtual reality goggles. The ride's carriage is supported by various actuators that lift and shift the carriage along multiple axes to give the rider a sense of immersion during the experience.

In the ride, the AllMotion DCH403-10 was used to drive several variable-speed 24 VDC fans that were wired in parallel and directed toward guests. The drive received speed command signals from a central computer that synced the action seen through the goggles with the motion and airflow, providing an immersive VR experience.

The DCH403-10 combines an AC-to-DC switching power supply with a regenerative PWM drive, creating an all-in-one solution for applications requiring control of 12 to 48 VDC motors when only 115 or 230 VAC power is available. The true lower output voltage runs the motor cooler and prolongs brush life. The DCH403-10 is capable of speed or torque control, as well as cycling and positioning control when used with limit switches or resistive feedback devices. In addition, the microprocessor allows the drive to be programmed for custom applications or routines to meet OEM requirements, and the built-in isolation keeps PLC interfacing safe and simple. The DCH403-10 is perfect for applications needing a wide range of low-voltage motor control with quick braking or on-the-fly reversing when only AC line power is available.

Contact:
Heitek Automation: https://www.heitek.com/
AllMotion: https://www.allmotion.com
American Control Electronics: https://www.americancontrolelectronics.com/
Dave & Buster's: https://www.daveandbusters.com/us/en/home
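At its core, the synchronization described above is a mapping from a scene cue to a fan speed command. The sketch below shows one hypothetical way a show computer could scale a normalized airflow cue into a 0-10 V speed reference with simple rate limiting; the interface and values are assumptions for illustration, not the DCH403-10's documented command input.

```python
# Hypothetical mapping from a VR scene's airflow cue to a drive speed command.
# The 0-10 V analog speed reference and the ramp limit are assumptions for
# illustration; they are not taken from the DCH403-10 documentation.

def airflow_to_command_volts(cue: float, v_min=0.0, v_max=10.0,
                             last_volts=0.0, max_step=2.0):
    """Convert a normalized airflow cue (0.0-1.0) to a speed-reference voltage,
    rate-limited so fan speed changes stay smooth between animation frames."""
    cue = min(max(cue, 0.0), 1.0)                 # clamp the cue
    target = v_min + cue * (v_max - v_min)        # linear scaling
    step = max(min(target - last_volts, max_step), -max_step)
    return last_volts + step

# Example: ramping airflow up as the virtual vehicle accelerates, then easing off.
volts = 0.0
for cue in (0.1, 0.4, 0.8, 1.0, 0.3):
    volts = airflow_to_command_volts(cue, last_volts=volts)
    print(f"cue {cue:.1f} -> command {volts:.1f} V")
```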
- High-Tech Camera and Software Capture the Speed of the French Grand Prix
To place the viewer at the center of the action, the French media company CANAL+ deployed multiple cameras to catch the raw speed of the French Grand Prix.

Blackmagic | Sports | Jul 22, 2025

MotoGP™, the motorcycle road racing world championship, has always tested the limits of what audiences can see and hear on screen. For its latest project, CANAL+ set out to capture not just the speed but also the quieter moments that define a race weekend. The documentary was filmed entirely with the new Blackmagic URSA Cine Immersive camera and finished in DaVinci Resolve Studio. The sports documentary is part of a new generation of immersive workflows for capture, postproduction, and viewing on Apple Vision Pro.

Produced in collaboration with MotoGP and Apple, the documentary follows world champion Johann Zarco and his team during their dramatic home victory at the French Grand Prix in Le Mans. The event was captured using the URSA Cine Immersive camera, with dual 8160 x 7200 (58.7 megapixel) sensors at 90 fps, delivering 3D immersive cinema content to a single file mixed with Apple Spatial Audio. The MotoGP sports experience places viewers in the heart of the action, from the pit lane and paddock to the podium.

All photos courtesy of Blackmagic Design.

"MotoGP is made for this format," said journalist Etienne Pidoux at CANAL+. "You feel the raw speed, and you see details you'd otherwise miss on a flat screen. It puts you closer to the machines and the team than ever before."

To place the viewer at the center of the action, CANAL+ deployed multiple URSA Cine Immersive cameras. "Two cameras were on pedestals and one on a Steadicam," explained Pierre Maillat of CANAL+. "We needed to swap quickly between Steadicam and fixed setups depending on what was occurring at the moment." "The Steadicam setup was extremely valuable," noted Pidoux. "It made us more reactive in a fast-changing environment, giving us more agility while filming."

All photos courtesy of Blackmagic Design.

"Immersive video changes how you shoot," added Pidoux. "You plan more, shoot less, and you rethink composition because of the 180-degree view, especially in tight or crowded spaces like the pit lane." Lighting was also a consideration inside the team garages. "We added some extra light to compensate for the 90 frames per second stereoscopic capture."

Each camera was paired with an ambisonic microphone to capture first-order spatial audio. The mics were then supplemented by discrete microphones for interviews and other critical sound sources. The documentary was recorded in A-format ambisonics for the immersive mix and channel-based audio for other sources. Everything was timecoded wirelessly and synced on both the cameras and the external recorders.

Post Production

A portable production cart with a Mac Studio running DaVinci Resolve Studio, alongside an Apple Vision Pro, was set up trackside to monitor and test shots in context. "This approach allowed us to check the content right after shooting and helped us verify framing while still on location," said Maillat. CANAL+ had a second Mac Studio running DaVinci Resolve Studio and an Apple Vision Pro set up at the hotel in Le Mans to handle media offload and backups. With 8TB of internal storage, recording directly to the Media Module, the crew could film more than two hours of 8K stereoscopic 3D immersive footage on the track without needing to change cards.
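The storage claim above is straightforward to sanity-check. The sketch below works out the sustained data rate implied by filling 8 TB in just over two hours and compares it with a rough uncompressed sensor rate; the 12-bit readout depth and decimal-terabyte convention are assumptions for illustration, not published Blackmagic specifications.

```python
# Rough sanity check of "more than two hours of footage on 8 TB".
# Bit depth and TB convention are assumptions, not Blackmagic specs.

TB = 1e12                                  # assuming decimal terabytes
record_rate_gbs = 8 * TB / (2 * 3600) / 1e9
print(f"Sustained record rate: ~{record_rate_gbs:.2f} GB/s")   # ~1.1 GB/s

# Uncompressed single-sample-per-photosite sensor rate, two 8160 x 7200
# sensors at 90 fps with an assumed 12-bit readout:
pixels_per_frame = 8160 * 7200 * 2
raw_gbs = pixels_per_frame * 90 * 12 / 8 / 1e9
print(f"Raw sensor data rate:  ~{raw_gbs:.0f} GB/s")           # ~16 GB/s

print(f"Implied compression:   ~{raw_gbs / record_rate_gbs:.0f}:1")
# Roughly 14:1 -- the kind of ratio in-camera RAW codecs are designed to reach.
```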
Postproduction took place in Paris, where CANAL+ used a Mac Studio running DaVinci Resolve Studio for editing, color grading, and audio mixing. The team was even able to preview the stereoscopic timeline directly in Apple Vision Pro, which was crucial for immersive grading. Spatial Audio was mixed using DaVinci Resolve Studio's Fairlight. "Initially, we planned to use a different digital audio workstation (DAW), but DaVinci Resolve Studio and Fairlight was the platform that gave us both creative flexibility and the high-quality deliverables for Apple Vision Pro," explained Maillat.

"Filming with the URSA Cine Immersive camera and viewing it in Apple Vision Pro, we found incredible moments we'd normally treat as background," Pidoux concluded. "Cleaning the track, helmet close-ups, the crowd, they all become part of the experience."

NOTE: The MotoGP™ Apple Immersive sports experience will be available exclusively on the CANAL+ app on Apple Vision Pro starting September 2025.

For more information: Blackmagic Design | MotoGP | CANAL+
- Dampening Festival Noise to Create Positive Experience for Attendees and Neighbors
Although 18,000 festival-goers are listening to live music for several days, neighbors needed the noise limited so that they could go on with their normal lives.

Edited by Terry Persun | Stage Events | Oct 17, 2025

Once a year, deep in the heart of a Lincolnshire wood, with its winding pathways, derelict buildings, abandoned cars, deserted junkyards, old relics, and fairy lights, 18,000 revellers gather to immerse themselves in four days of live music, arts, performance, food, culture, wellness, and relaxation. Welcome to the Lost Village, where music and festivities continue until 2:00 am.

At one of the Lost Village events, the sponsors—with the help of Three Spires Acoustics—needed to find a new approach to dealing with the night-time music noise limits, providing the festies with a good night out and the local community with a good night's sleep.

The woods are abandoned most of the year, but for four days in summer, a secluded woodland near the village of Norton Disney, Lincolnshire, comes to life as partygoers, DJs, artists, and the beat of dance music pervade the bucolic environment throughout the day and into the night. However, at 11:00 pm sharp, regulatory night-time noise level limits imposed by the local authority come into force and the volume is turned down. Imagine the disappointment and dissatisfaction for music fans and artists alike, where a worst-case scenario could even result in crowd management issues.

The Right Balance

For festivals such as Lost Village, noise limits pose the challenge of keeping the right balance between optimal concert sound and reduced noise in the surrounding environment. Keeping noise levels below prescribed limits is also essential to maintaining a Premises Licence (permit to operate) and gaining local community buy-in. To get the balance right, Lost Village founder Andy George has been working closely with Three Spires Acoustics, a leading independent consultancy involved in event noise management and regulatory control. Specializing in services and solutions for a diverse client base, the firm assesses, resolves, and manages noise and pollution issues for, among others, entertainment venues and outdoor concerts.

One of the main causes of discontent has always been the hard level reduction at 11:00 pm, which can mean a decrease in allowable offsite noise levels of up to 20 dB. The team needed to find a way of overcoming this issue while remaining compliant with regulatory requirements.

An Innovative Approach

Lost Village and Three Spires Acoustics came up with a simple but innovative approach: applying a staggered reduction in sound levels between 11:00 pm and midnight. The solution was made possible only by the flexibility and support of the local authority, North Kesteven District Council, and the use of technological advances in hardware, including B&K 2245 sound level meters with Enviro Noise Partner (a complete, focused toolset for environmental noise measurements), combined with Noisy's noise monitoring platform, integrated for use with the B&K 2245 via the sound level meter's open application programming interface (API).
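What a staggered reduction looks like in practice can be sketched in a few lines. The schedule below is purely illustrative; the festival's actual step sizes, timings, and licence conditions are not reproduced here. It simply shows how an overall 20 dB cut can be split into gentler steps and what each step means as a ratio of sound intensity.

```python
# Illustrative staggered level reduction: spreading a 20 dB cut over an hour
# instead of applying it in one step at 11:00 pm. Step sizes and timings are
# assumptions for illustration, not the festival's actual licence conditions.

TOTAL_REDUCTION_DB = 20.0
STEPS = 4                                   # e.g. one step every 15 minutes

step_db = TOTAL_REDUCTION_DB / STEPS
for i in range(1, STEPS + 1):
    cumulative_db = i * step_db
    intensity_ratio = 10 ** (-cumulative_db / 10)   # relative sound intensity
    minutes = (23 * 60 + i * 15) % (24 * 60)        # 11:00 pm start
    print(f"{minutes//60:02d}:{minutes%60:02d}  -{cumulative_db:.0f} dB  "
          f"(~{intensity_ratio:.3f}x of the pre-11pm intensity)")
# By midnight the full -20 dB applies: 1/100th of the sound intensity,
# which listeners perceive as roughly a quarter of the loudness.
```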
Once installed, the fully integrated Noisy platform allowed all stages to be monitored at front of house (FOH) or side of house (SOH) positions and provided a central control point displaying all onsite stage sound levels (LAeq and LCeq), along with three permanent offsite monitoring stations (connected via a 4G router). Real-time monitoring enabled the engineers to follow, prevent, and correct the acoustic impact of internal sound and external noise levels and to manage the staggered level reduction while remaining compliant with the licence conditions. Reducing the sound levels gradually at each of the seven main stages discreetly acclimatized the audience to the lower limits over a period of one hour, making the change in volume less dramatic than the step change of previous years.

One of the main advantages of the Noisy platform is that it can accommodate SOH or back of house (BOH) mixer desk locations by locating B&K 2245 sound level meters remotely at the back of a big top stage and Noisy tablet displays at SOH or BOH positions, both connected over a managed network. Traditionally, this was not possible without a hard connection using long lengths of XLR cable. Power over Ethernet (PoE) for both the sound level meters and the Noisy tablet displays also makes the system much more robust. The ability to schedule different parameters by stage and time, and to make on-the-spot changes in reaction to offsite readings, was invaluable.

A satisfied audience and artists, full regulatory compliance, and very few noise complaints made the new approach a huge success. Although Lost Village is located in a dense wood, high technology, flexibility, innovation, and reliable and efficient digital tools were crucial to the success of the event.

Photos: Copyright © Lost Village. Thank you to Chris Hurst at Three Spires Acoustics for his help with this article.

For more information: Hottinger Bruel & Kjaer | Lost Village | Three Spires Acoustics | Noisy Software | B&K 2245 Sound Level Meters

Read more about concert technology >>>












