- These Robot Bugs Inspired by Nature Mimic the Real Thing
Creating new designs includes a wide view of the possibilities long before nailing down the details.

By Terry Persun | Museums, Cool Stuff | Sep 23, 2025

Entertainment Engineering Magazine is based on the concept of technology transfer, where an engineer reading an article about one industry will instantly transfer that information to whatever projects they might be involved with in other industries. We've also heard this called cross-industry innovation. Festo has been doing something similar, only using nature as the crossover point before applying cross-industry innovation. It's pretty cool, and we thought we'd bring some of their thinking to you.

In a talk given by Dr. Elias Knubben, head of corporate bionic projects at Festo, we learn some important elements about how creativity is being used to advance automation. EE pulled some of the key points from a video produced by Wired UK. Here's what we got from the talk:

Festo has a small team that explores technology through the window of nature. They've created butterflies and dragonflies, but also ants that work together as a team, and have recreated what resembles the tongue of a lizard. Many of the capabilities designed using nature as a model have found their way into industrial products, including control systems and robot end effectors.

To create innovative products, the small team that works with Dr. Knubben goes for concept first. They look for something innovative, fascinating, educational, and inspiring. These elements are necessary so that the project gets everyone's attention and gets everyone involved. Each member must bring confidence and the courage to fail to the project, even as early prototypes are created and even when requesting funding. Ultimately, though, the risk is shared.

From this point, the whole team can come together. By working in interdisciplinary teams that literally work side by side, each person can come at a possible solution from a different angle. Using biological role models for inspiration, the team gets started as quickly and easily as possible. Once started, they are able to incubate their ideas and then put several methods into place at the same time.

During the project phase, Dr. Knubben's team constantly zooms in and out to come up with iterations and variants on a particular project. To do this, Festo incorporates generalists to keep an eye on the big picture and specialists to dive deeply into the technical details, such as creating the circuit boards, writing code, etc. Together, the team has the freedom and the playground to work on any project.

In the end, the result is not simply a mimicked ant or kangaroo but the algorithms and principles of operation developed along the way, which are now available for other industrial automation devices. Because Festo is involved with a wide variety of industries, such as electronics, automotive, life sciences, process control, and food processing, the company is continually employing cross-industry innovation to move its products into the future. Much of the innovation for this forward motion comes from nature.

Something Fun

A fun display created by Festo for a recent tradeshow uses a Rube Goldberg approach to take viewers through the history of automation. A Rube Goldberg machine takes a simple process and represents it using a complex and convoluted series of chain reactions.
Called the Incredible Machine, the Festo display starts with moving the wings of a butterfly as a metaphor for the butterfly effect: the concept in chaos theory where a small change in conditions leads to vastly different outcomes in a complex system. Watch the video below and see just how significant one small event can become as it moves through a variety of technologies in various industries, only to end much as it began.

*All photos courtesy of Festo.

For information:
Festo: https://www.festo.com/us/en/
Adaptive Shape Gripper: https://www.festo.com/us/en/p/adaptive-shape-gripper-id_DHEF/?q=robotic%20grippers~:festoSortOrderScored
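As a quick numerical aside, the butterfly effect described above is easy to demonstrate with the logistic map, a standard one-line chaotic system (our choice of illustration; it is not part of the Festo exhibit). Two trajectories that start one part in a billion apart stop resembling each other within a few dozen iterations:

```python
# Logistic map x -> r*x*(1-x) in its chaotic regime (r = 3.9).
# Two starting points that differ by one part in a billion soon
# diverge completely: the butterfly effect in miniature.
r = 3.9
x, y = 0.5, 0.5 + 1e-9
for step in range(1, 51):
    x, y = r * x * (1 - x), r * y * (1 - y)
    if step % 10 == 0:
        print(f"step {step:2d}: x={x:.6f}  y={y:.6f}")
```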
- Formula 1 Wings Use Mini Solenoid Valves
The fast-moving world of Formula 1 motor racing technology is constantly changing as a result of the desire to gain an advantage in such a highly competitive arena, as well as the need to comply with ever-changing rules and regulations.

Edited by EE Staff | Sports | Sep 11, 2025

After being banned for 40 years on safety grounds, driver-activated rear wings returned to Formula 1 in 2011. This decision was made at a meeting of the sport's governing body, the FIA World Motor Sport Council, with the objective of allowing more overtaking in F1. One of the beneficiaries of this decision was The Lee Company, which has sold hundreds of miniature solenoid valves to all of the F1 teams for a wide range of applications, including fuel flaps, emergency clutch disengagement, reverse gear selection, power steering, auxiliary lube oil tank top-up, and front wing control. In fact, The Lee Company specifically developed its Performance Racing Solenoid Valve at the request of its F1 customers, and it was subsequently homologated by the FIA for use in F1.

In the video below, Marc Priestley builds a scale(ish) model to explain exactly how DRS works in F1. Marc "Elvis" Priestley worked for McLaren Racing as a Formula One mechanic and member of the pit stop crew from 2000-2009. He worked with a distinguished list of drivers including Mika Hakkinen, David Coulthard, Kimi Raikkonen, Jean Alesi, Juan Pablo Montoya, Fernando Alonso, and Lewis Hamilton.

For the real F1, the active front wings have a trailing edge that can be adjusted by the driver to increase downforce during braking. Active front wings were a feature on Formula 1 cars for the 2010 season but were banned for the 2011 season, to be replaced by the active rear wing. Drivers are now able to control the rear wing once they have been behind another driver for a set amount of time, thereby reducing drag on the straightaway and allowing greater opportunity for overtaking.

Photo: The high-flow 3-way single-coil solenoid valve. All photos courtesy of The Lee Company.

The wings are raised and lowered by a large hydraulic actuator, which must move very quickly and therefore demands a fast-acting, high-flow valve to control it. The Lee Company developed a 12 VDC version of its 3-way high-flow solenoid valve specifically for this purpose, and several Formula 1 teams have specified the valve for this function. This modified valve can operate at temperatures of up to 329°F (165°C). Also, the retaining nut has been removed to save space and reduce weight, and the lead wires now exit from the rear of the valve instead of at the hydraulic end.

The Lee Company's miniature 3-way high-flow solenoid valves are a natural evolution of its proven piloting solenoid valves, which set new standards in reducing space, weight, and power consumption. Valve elements are based on the low-leakage, highly reliable designs used in Lee check and shuttle valves. Functions and features include two-position, 3-way operation, a 3000 psi rating, 4.0 GPM minimum flow at 3000 psid (300 Lohms), a single coil, and an integral safety screen with a 0.004 inch hole size. Power consumption is 7.8 W at 18-32 VDC, and the valves can operate within a temperature range of -65°F to 275°F (-54°C to 135°C).
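The "300 Lohms" figure ties the flow and pressure numbers together. The Lee Company publishes a liquid "Lohm law" relating flow I (GPM), pressure drop ΔP (psid), and specific gravity S: Lohms = 20·√(ΔP/S)/I. A quick sanity check of the quoted 4.0 GPM at 3000 psid (the 0.85 specific gravity for a typical hydraulic fluid is our assumption):

```python
import math

def lohm_flow_gpm(lohms: float, dp_psid: float, sg: float = 1.0) -> float:
    """Liquid flow in GPM from the liquid Lohm law:
    Lohms = 20 * sqrt(dP / SG) / I  =>  I = 20 * sqrt(dP / SG) / Lohms."""
    return 20.0 * math.sqrt(dp_psid / sg) / lohms

# 300 Lohms at 3000 psid: water vs. a typical hydraulic fluid
print(round(lohm_flow_gpm(300, 3000, sg=1.0), 2))   # ~3.65 GPM (water)
print(round(lohm_flow_gpm(300, 3000, sg=0.85), 2))  # ~3.96 GPM, consistent with the 4.0 GPM minimum
```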
These miniature solenoid valves are part of The Lee Company's extensive range of plugs, restrictors, check valves, relief valves, shuttle valves, filter screens, and flow controls, which are all being used in current Formula 1 cars and engines. Formula 1 and F1 are trademarks of Formula One Licensing BV, a Formula One Group Company.

For more information:
The Lee Company
The Lee Company Performance Racing Solenoid Valve
Marc Priestley F1 Elvis
- Drive Solutions for Arcade Machines
Amusement machines require smooth, quiet, and reliable operation. Here are the motors that make that happen.

Edited by EE Staff | Mini Story | Nov 3, 2025

If you're looking for field-proven solutions for a range of amusement machines, consider the company that provides motors for a wide range of coin pushers and claw machines. Parvalux motors have been specified by amusement machine and arcade equipment manufacturers, with machines found in over 80 countries. Key factors for these types of machines include smooth, quiet, and reliable operation aimed at providing the best user experience. Claw machines need to offer smooth, controllable operation, and coin-drop machines need a similarly smooth function. These machines require components that need minimal maintenance, reducing downtime and lowering total cost of ownership. For unique applications, the company also offers semi-custom and custom drive solutions. To help you select the right modular motor, Parvalux has an easy-to-use selection tool. Check it out here.
- K-Tek TadpoleX Holds GoPro
Video camera and audio accessory boom poles

By EE Staff (pub. 2013) | Cool Stuff | Jun 4, 2025

K-Tek makes video camera and audio accessory boom poles and microphone support products. Its Tadpoles have become a popular solution for holding small HD action cameras. The newest member of the line, the TadpoleX, is made of black anodized aluminum. The unit's three-section camera pole is engineered with the same precision as K-Tek's market-leading boom microphone poles. With a fixed ¼-20 threaded stud mount, the TadpoleX is ideal for small cameras such as the GoPro. Plus, weighing just 8 ounces, the camera and accessory pole can extend up to 3 feet, 7 inches, yet collapses to an easily storable length of 19 inches. Designed for efficiency, the TadpoleX is quick and easy to use in an extreme sports environment. The sections securely lock in place with a simple twist. The pole also features a padded handgrip, made of closed-cell vinyl, to help insulate and cushion vibration. In addition, the comfortable, adjustable woven wrist lanyard ensures that the camera attached to the K-Tek TadpoleX stays secure.

For more information: K-Tek
- The James Webb Space Telescope
Here's a look at some of the challenges overcome to build the remarkable space science observatory.

By Joe Gillard | Cool Stuff | Jun 7, 2025

In July of 2023, the James Webb Space Telescope (JWST) marked its one-year anniversary on station a million miles (1.5 million km) from Earth. Its breathtaking imagery, derived from infrared (IR) wavelengths, shows galaxies at the farthest reaches of space and time, future planetary systems in the making, and previously invisible details of Earth's solar system neighbors.

Origins, challenges

The JWST, an international program led by NASA, was conceived in 1988 as a follow-on to the Hubble Space Telescope. To see farther than Hubble, the new observatory would have to use IR light. It required a mirror more than 20 ft (6.5 m) across, and a sunshield the size of a tennis court to protect it from heat sources that would overwhelm the faint IR signatures of distant stars. Fully deployed, these structures are too large to fit atop any rocket. The mirror and sunshield would have to survive launch, then unfold flawlessly to be reassembled in space.

Photo: Mirror (left) and an image taken by the James Webb Space Telescope (right).

Mirror

The mirror collecting the IR light is composed of 18 hexagonal segments, each nearly five feet across. On orbit, small motors carefully realign the segments into a honeycomb surface. Northrop Grumman was the prime contractor for the JWST. In 2004, the contractor responsible for the mirror, Axsys Technologies, selected Mitsui Seiki to supply the machine tools needed to produce the mirror segments.

The president of Mitsui Seiki USA at the time, Scott Walker, recalls the challenges of building eight Mitsui Seiki HS6A horizontal machining centers, with near-identical tolerances, to manufacture the mirror segments. "The [machines'] axes are straight to 2µm, and the perpendicularity is within 3µm over 1m. This is remarkable for machines of this size." For comparison, 2µm (0.002mm) equals 0.00007874″, about 1/50 the thickness of a human hair.

The 18 mirror segments would be machined from billets of cast beryllium – a light, stiff, strong, thermally stable metal with some toxicity – "and very difficult to polish into the perfect shape," Walker says. Each billet weighed 700 lb (318 kg), reduced to 28 lb (13 kg) by the end of 18 weeks of machining time. The mirror surface's specified thickness was 0.098″ ±0.003″ (2.49mm ±0.076mm). The 600 pockets cut into the back of each mirror panel featured eight different rib segments, 0.021″ to 0.2″ (0.53mm to 5mm) thick.

The experience had lessons for future Mitsui Seiki machines. "What we learned on those big horizontal 4-axis machines was how to build accurate, big 5-axis machines," Walker says. "I don't think we could have successfully built a big 1m or 2m trunnion machine, and 20µm in the cube, without doing those eight machines."

Sunshield

"We don't want the telescope to glow brighter than the stars it's looking at, so the telescope has to be cold – only 55 degrees above absolute zero, or -361°F (-218°C)," says Mike Menzel, the NASA mission systems engineer for the JWST at the Goddard Space Flight Center. How do you get 3 metric tons of telescope to that temperature? The first part is putting it 1 million miles away, at a place where the Earth, sun, and moon can all be behind the spacecraft.
"We'll build a big umbrella called the sunshield that'll keep the telescope in the shade, and the telescope will naturally cool down to that temperature," Menzel explains. The sunshield is illuminated by 200,000W of solar radiation, but it must transmit only 0.02W. "I tell people if it were suntan lotion, it would have a sun protection factor (SPF) of 10 million." The sunshield is about 69 feet by 49 feet, and it must fit into an 18-foot-diameter launcher. The only way to do that is to fold it up.

"Folding up the sunshield was one of the banes of my existence, because now it has to unfold in orbit," Menzel says. "When I started in this business in 1981, the very first thing I learned was never deploy anything in space. Something always goes wrong." "My colleagues are all standing around, thinking of how this thing needs to get out there, unfold, and work. And I'm thinking, I'm either going to have a job, or I'm not going to have a job or ever be able to get another one."

During ground testing, every time the delicate sunshield was unfolded, it took about three months to fold it back up, requiring three-story-tall cherry pickers to accomplish the repacking. The development program for the process took about three years. The JWST required 50 of the most complex deployments ever attempted. Once the JWST reached space, the solar panel array deployed to start putting electrical power back into the batteries used during launch. After the JWST passed the moon's orbit, two big panels carrying the folded-up sunshield were rotated into position. All the deployments were controlled from the ground and took 14 days. "We released 107 actuators that allow roll-up covers to unfold the sunshield," Menzel states.

Complicating matters, the sunshield is composed of five individual layers of reflective material, made of specially coated Kapton. Individual pulley systems tighten the layers into the correct trapezoidal shape. Each layer must sit the correct distance from its neighbors and extend fully to properly isolate the telescope from stray IR radiation. Next, the telescope's secondary mirror must move into position, and the wings of the main mirror rotate from their stored position and lock in place.

"All the deployments went successfully: 344 single-point-failure items all had to work correctly for this to happen," Menzel says. The cool-down period went correctly, and the telescope was aligned. "We literally had to rebuild that telescope and realign it on orbit. That has never been done before," Menzel says. By April 2022, all the instruments were working, and the first images were revealed in July. "The first image I saw was a galaxy cluster about 4 billion light years from us. But so many objects in it are about 13 billion light years away," Menzel recalls. It took Hubble 14 days to gather the data for a similar image; this JWST image required only 12 hours.

After he saw that first image, Menzel knew he had a job for the next 20 years. He's confident the JWST will be operational that long because the rocket put it exactly on target at L2. "We put 10 years of maneuvering fuel on it, but because we had such a great launch, the fuel will last for 20 years," Menzel says proudly. The telescope is working twice as well as it's supposed to, Menzel says. "We're supposed to be diffraction limited at a wavelength of 2µm. We're currently diffraction limited at 1µm, meaning the total error on that telescope is only about 1µm divided by 14."
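Two of Menzel's figures are easy to sanity-check. The sunshield's attenuation is simply the ratio of incident to transmitted power, and "diffraction limited" is conventionally read (via the Maréchal criterion, our gloss here, which matches his "divided by 14") as a total RMS wavefront error below λ/14:

```latex
\[
\frac{P_{\text{incident}}}{P_{\text{transmitted}}}
  = \frac{200{,}000\ \text{W}}{0.02\ \text{W}} = 10^{7}
  \quad\text{(hence ``SPF 10 million'')}
\]
\[
\sigma_{\text{wavefront}} \lesssim \frac{\lambda}{14}
  = \frac{1\,\mu\text{m}}{14} \approx 71\ \text{nm RMS}
\]
```

Goals

What can the JWST accomplish?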
Menzel lists the project's goals: "When this started, we had four science objectives. First, we want to see the first light that turned on in the universe, which we believe began about 13.7 billion years ago, right after the big bang. There was a dark period where there's no light, but maybe about 400 million years after that, the first stars came, and we want to see them."

"Next, we want to see how galaxies evolve through cosmic time. When you look at the galaxies very far away in the universe, they look like blobs. We want to see how those blobs evolve into spirals or other structures. We want to see how stars are born in our own Milky Way galaxy. And finally, we want to see how solar systems are born and how they evolve to make planets."

"With a year of science under our belts, we know exactly how powerful this telescope is, and have delivered a year of spectacular data and discoveries," says Webb Senior Project Scientist Jane Rigby of the Goddard Space Flight Center. "We've selected an ambitious set of observations for year two that builds on everything we've learned so far. Webb's science mission is just getting started – there's so much more to come."

Originally published in Aerospace Manufacturing and Design.
- How Did Imagineers Make This Animatronic Walt Disney So Lifelike?
Walt Disney himself is "brought to life," leaving spectators astonished.

By Joe Gillard | Theme Parks | Jul 22, 2025

Disneyland Park opened on July 17, 1955, and Disney is celebrating the 70th anniversary with some show-stopping technology from the engineers at Walt Disney Imagineering. A new park attraction based on Walt Disney, the man himself, called "Walt Disney – A Magical Life," debuted last week at the Main Street Opera House in Disneyland. Disney says "guests will first experience a cinematic journey (approximately 15 minutes) through the film, 'One Man's Dream,' culminating in a visit with Walt in his office, made possible through the magic of Audio-Animatronics® storytelling."

Also from EE: Wicked Technology Defies Gravity on Broadway

Animatronic Walt Disney leaves commentators stunned

Perhaps the main draw of this new attraction is a lifelike, animatronic Walt Disney that moves, talks, and walks. The park says "guests will hear heartfelt stories, anecdotes and words of wisdom shared by Walt using historical recordings." Walt Disney Imagineering has been working on this project for seven years, and Disney says the idea for doing something like this goes back even further. The dedication shows, apparently. "Seated in the audience, I couldn't figure out exactly how it works," writes Jacob Krol of TechRadar. "There's no visible harness or backing, not even leads from the desk. This is the first entirely electric figure to complete that lean-to-stand motion fully." He reports that the Imagineering team did deep research on human facial features and movements, right down to the cornea bulge of the eye.

The level of detail and care that went into the research and the project left many feeling that Disney had done right by the man and his family. "The expertise and care that Walt Disney Imagineering has devoted to this project is nothing short of remarkable," said Kirsten Komoroske, Executive Director, The Walt Disney Family Museum. "I think that Walt would be thrilled with the blend of cutting-edge technology and artistry. And I think that he would be touched by the tribute."

Technology

So, how exactly did the Imagineering team achieve what they're calling "the most lifelike figure that Walt Disney Imagineering has ever created?" It's hard to find anything about specifics, only that the team referred to their innovations as "moonshots." Audio-Animatronics is the trademarked name of the mechatronic animatronic technology familiar to anyone who has been to Disneyland. These figures generally feature movement synchronized with external audio (think Pirates of the Caribbean). This Walt Disney attraction is the latest iteration of that, and one Disney seems particularly proud of. The puppeteering works through a combination of motors, fluid power, solenoids, and cables. These mechanics are combined with programming to make sure everything works harmoniously. "Many of Disney's Audio-Animatronics figures are designed to move in concert," says Disney, "with choreographed movements timed by complex audio cues and digital signals.
These movements require extensive engineering — courtesy of Disney Imagineers — and programming to create a seamless display of characters in action." The animatronic system relies on a combination of electric motors, solenoids, hydraulic systems, pneumatic systems, and cables to produce repeatable puppet movements that sync to sound.

The animatronic Walt is "the first 'lean to stand' motion for an all-electric figure," according to Disney, and press material points to a "twinkle in the eye," a focus on "muscle structure and the nuances of speaking and gesturing such as how the mouth falls when the figure is speaking, as well as the very Walt mannerisms and movements including the way Walt used his hands."

Typical of Disney, there was a commitment to detail. "We worked closely with the Walt Disney Archives and The Walt Disney Family Museum to depict the details of Walt and his office accurately in this theatrical presentation," said Jeff Shaver-Moskowitz, Portfolio Executive Creative Producer, Walt Disney Imagineering. "Most importantly, we were passionate about creating an Audio-Animatronics figure designed specifically for this attraction, delivering a portrayal that has his nuances, hand gestures, facial expressions, and more, all of the attributes that make this figure's performance feel uniquely Walt, and not simply creating a figure to look like Walt."

See below for a review from a former Imagineer:

For more information, visit Walt Disney: A Magical Life.
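To make the "choreographed movements timed by complex audio cues" idea concrete, here is a minimal, entirely hypothetical sketch (our illustration, not Disney's control software) of how actuator targets can be keyframed against an audio timeline and interpolated at playback. Driving every actuator from the same audio clock is what keeps dozens of motions in concert:

```python
from bisect import bisect_right
from dataclasses import dataclass

@dataclass
class Keyframe:
    t: float    # time on the audio timeline, seconds
    pos: float  # normalized actuator target, 0..1

def target_at(track: list[Keyframe], t: float) -> float:
    """Linearly interpolate an actuator target for audio time t."""
    times = [k.t for k in track]
    i = bisect_right(times, t)
    if i == 0:
        return track[0].pos
    if i == len(track):
        return track[-1].pos
    a, b = track[i - 1], track[i]
    frac = (t - a.t) / (b.t - a.t)
    return a.pos + frac * (b.pos - a.pos)

# Hypothetical jaw-actuator track for one second of dialogue
jaw = [Keyframe(0.0, 0.0), Keyframe(0.4, 0.8), Keyframe(0.6, 0.2), Keyframe(1.0, 0.0)]
for t in (0.0, 0.5, 0.8):
    print(f"t={t:.1f}s -> jaw target {target_at(jaw, t):.2f}")
```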
- Hi-Tech Cameras Used to Shoot TV Drama
Japanese drama uses multiple Blackmagic cameras and DaVinci Resolve Studio for grading.

By Terry Persun | Film and TV | Jun 19, 2025

Blackmagic Design recently announced that the ABC TV and TV Asahi network drama series "It's a Wonderful Teacher!" was shot on Blackmagic URSA Mini Pro 12K and Blackmagic Pocket Cinema Camera 6K Pro digital film cameras. Additionally, DaVinci Resolve Studio was used for grading the series.

"It's a Wonderful Teacher!" is a television drama produced by ABC TV starring Erika Ikuta. The series depicts the struggles of a young high school teacher, Rio Sasaoka, from Generation Z. In the show, Rio experiences so much stress in her second year of teaching that she considers quitting. However, by addressing challenges in modern education and building relationships with her students, she gradually grows as an educator.

According to cinematographer Akiyoshi Konno, who was in charge of shooting the series, "Since the story is set in a high school, there are more than 30 cast members in the classroom scenes. Because of this, we used a multi camera setup. The main camera was the URSA Mini Pro 12K, while the B and C cameras were Pocket Cinema Camera 6K Pros. We divided the classroom in half, first shooting one side of the students and then moving to the other."

Konno continued, "It's rare to be able to shoot a TV drama in RAW in Japan, but since the goal was to achieve a cinematic look, I decided to shoot in 24P Blackmagic RAW. I personally own Blackmagic Design cameras, so I was already familiar with their ease of use and excellent color quality. Additionally, for a multi camera shoot, the Pocket Cinema Camera was an affordable option, which was also a key factor in choosing it."

The series was filmed using SIGMA 18-35mm and 50-100mm cinema lenses. Practical lighting techniques, where light sources are visible within the frame, were also incorporated into the production. DaVinci Resolve Studio was used for grading. After completing the first episode, Konno attended the grading sessions to discuss skin tone adjustments and color emphasis. "Additionally, I created a LUT combining ARRI and Kodak LUTs in DaVinci Resolve and used it during the shoot," said Konno.

Konno said that one of his favorite features of Blackmagic Design cameras is their easy-to-use false color function. "Given the time constraints of a TV drama production, it was incredibly useful for fine tuning colors and monitoring lighting levels," he concluded.

For information:
Blackmagic Design: https://www.blackmagicdesign.com/
DaVinci Resolve: https://www.blackmagicdesign.com/products/davinciresolve
"It's a Wonderful Teacher!": https://japan-programcatalog.com/en/program/itsawonderfulteacher
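Combining two LUTs, as Konno describes, amounts to function composition: apply the first table, look the result up in the second, and bake the chain into a single table. A minimal sketch with toy 1D curves standing in for the (proprietary) ARRI and Kodak LUTs; production LUTs are 3D RGB tables, but the principle is the same:

```python
import numpy as np

def compose_luts(lut_a: np.ndarray, lut_b: np.ndarray) -> np.ndarray:
    """Bake 'apply lut_a, then lut_b' into one 1D LUT. Each LUT is an
    array of output values sampled uniformly over the input range [0, 1]."""
    inputs = np.linspace(0.0, 1.0, len(lut_b))
    return np.interp(lut_a, inputs, lut_b)  # look lut_a's outputs up in lut_b

# Toy stand-ins: a gamma-style curve followed by a soft S-curve
lut_gamma = np.linspace(0, 1, 256) ** 0.8
lut_scurve = 0.5 - 0.5 * np.cos(np.pi * np.linspace(0, 1, 256))
combined = compose_luts(lut_gamma, lut_scurve)
print(combined[[0, 128, 255]])  # black, mid-gray, and white after both curves
```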
- Brain-Controlled Flight
Simulating brain-controlled flying at the Institute for Flight System Dynamics

By EE Staff | Cool Stuff | Jun 4, 2025

Pilots of the future could be able to control their aircraft by merely thinking commands. Scientists of the Technische Universität München (TUM) and TU Berlin have now demonstrated the feasibility of flying via brain control, with astonishing accuracy.

The pilot wears a white cap with myriad attached cables. His gaze is concentrated on the runway ahead of him. All of a sudden the control stick starts to move. The airplane banks and then approaches straight on toward the runway. The position of the plane is corrected time and again until the landing gear gently touches down. During the entire maneuver the pilot touches neither pedals nor controls.

"A long-term vision of the project is to make flying accessible to more people," explains aerospace engineer Tim Fricke, who heads the project at TUM. "With brain control, flying could become easier. This would reduce the workload of pilots and thereby increase safety. In addition, pilots would have more freedom of movement to manage other manual tasks in the cockpit."

The scientists have logged their first breakthrough: they succeeded in demonstrating that brain-controlled flight is indeed possible, with amazing precision. Seven subjects took part in the flight simulator tests. They had varying levels of flight experience, including one person without any practical cockpit experience whatsoever. The accuracy with which the test subjects stayed on course by merely thinking commands would have sufficed, in part, to fulfill the requirements of a flying license test. "One of the subjects was able to follow eight out of ten target headings with a deviation of only 10 degrees," reports Fricke. Several of the subjects also managed the landing approach under poor visibility. One test pilot even landed within only a few meters of the centerline.

The TUM scientists are now focusing in particular on the question of how the requirements for the control system and flight dynamics need to be altered to accommodate the new control method. Normally, pilots feel resistance in steering and must exert significant force when the loads induced on the aircraft become too large. This feedback is missing when using brain control. The researchers are thus looking for alternative methods of feedback to signal when the envelope is pushed too hard, for example.

In order for humans and machines to communicate, the pilots' brain waves are measured using electroencephalography (EEG) electrodes connected to a cap. An algorithm developed by scientists from the Physiological Parameters for Adaptation team at TU Berlin allows the program to decipher electrical potentials and convert them into useful control commands. Only the very clearly defined electrical brain impulses required for control are recognized by the brain-computer interface.

For more information: Technische Universität München
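The TU Berlin algorithm itself is not published in this article, but a common EEG-to-command pipeline in brain-computer interfaces works roughly like this (a hypothetical sketch, not the researchers' code): compute band power over the sensorimotor cortex and map the left/right asymmetry to a continuous control value:

```python
import numpy as np

FS = 256  # Hz, an assumed EEG sampling rate

def band_power(x: np.ndarray, lo: float, hi: float) -> float:
    """Mean spectral power of signal x within the [lo, hi] Hz band."""
    spectrum = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / FS)
    band = (freqs >= lo) & (freqs <= hi)
    return float(spectrum[band].mean())

def roll_command(left_ch: np.ndarray, right_ch: np.ndarray) -> float:
    """Map mu-band (8-12 Hz) power asymmetry between left- and
    right-hemisphere channels to a roll command in [-1, 1]. Imagined
    movement suppresses mu power over the opposite hemisphere."""
    l = band_power(left_ch, 8.0, 12.0)
    r = band_power(right_ch, 8.0, 12.0)
    return float(np.clip((l - r) / (l + r), -1.0, 1.0))

# One second of synthetic two-channel "EEG" as a smoke test
rng = np.random.default_rng(seed=1)
print(roll_command(rng.standard_normal(FS), rng.standard_normal(FS)))
```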
- TED2014 Uses DPA Microphones
McCune Audio, full-service technical event specialists, rely on classic 4088s and discreet 4060s to mic guests and props for the renowned conference.

By McCune Audio | Stage Events | Jun 4, 2025

Heard by thousands of people through live audiences, simulcasts, webcasts, and recordings, and by millions more through online video engines, the annual TED Conference requires gear that matches its prestige. With discussions that address a wide range of topics within the research and practice of science and culture, microphones are vital to all TED audiences. To provide the ultimate in sound quality for presenters and performers at TED2014, the technical team from McCune Audio/Video, the symposium's sound services company, relies on DPA Microphones' classic 4088 Directional Headset Microphones and the d:screet 4060 Omnidirectional Microphone.

TED2014 marks the second time that DPA Microphones is part of the audio setup for the conference. The McCune Audio team used the same selection of mics at last year's event because of the high-quality audio of both mics and the comfortable fit and feel of the 4088, requests that came from the production crew and presenters, respectively.

"For years, the post-production crew was trying to get us to reduce the amount of room acoustics we were hearing in the recordings," says Nick Malgieri, McCune's Head of Audio for the TED Conference. "Because of the live PA system in the room, there was a slap-back echo always happening in the voice. We really used some pretty extreme processing paths with our previous mics just to try to help us get ahead of the issues. Once we switched to DPA 4088s, the added isolation let us really scale back on the processing. The sound was so much better that the post-production crew asked us to deploy ambient mics to capture the energy of the venue. That led us to the d:screet 4060s, which we use to pick up a lot of the stage noise from the props and scenery on set, such as chalkboards during presentations, a fire organ for a pyrophone performance or a target being hit by an archer."

The versatility of the DPA mics also plays an important role in the audio support at the TED Conference, which hosts twelve 90-minute shows over the course of five days. Throughout each of these individual productions, Malgieri and his crew need to mic as many as 10 separate presenters, each speaking in segments of up to 18 minutes. In these instances, the audio team uses the 4088s. The shows also incorporate other types of presentations, such as musical, theatrical, or dance performances, or magic acts, which require the 4060s.

"The technology setup for this show is very important because of the high turnover and the way the show is actually produced," explains Malgieri. "We are doing three shows a day, with only an hour or so to reset between them, so we need high-end gear that can keep up with the constant demand and rigorous use."

As one of DPA's most popular products, the 4088 is ideal for a variety of performing and vocal presentations. It is designed for acoustically demanding live performance environments, where background noise and feedback are a concern, and boasts the same open and natural sound qualities of the company's other legendary microphones.
Originally designed for use with wireless systems in theater, television, and close-miked instrument applications, the d:screet 4060 capsule is highly unobtrusive. Because of its small size, this tiny condenser mic exhibits an exceedingly accurate omnidirectional pattern, and therefore does not need to be aimed directly at the sound source to achieve quality pickup.

For more information, visit DPA Microphones.
- Mapping the Seafloor for Underwater Explorations
Acoustic echosounder simultaneously collects bathymetric, seafloor backscatter, and water column backscatter data to identify seafloor and water column features.

Edited by EE Staff | Cool Stuff | Sep 9, 2025

Header image caption: High-resolution seafloor mapping revealed unusual pancake-like features of a seamount in the Moonless Mountains chain in the Eastern Pacific.

To plan efficient and safe operations, the exploration vessel Nautilus often creates its own seafloor maps, particularly when exploring little-known regions of the ocean. To facilitate this operation, the ship incorporates equipment that provides high-quality seafloor maps at depths to 7,000 meters (23,000 feet). Whether focused on a canyon, seamount, or shipwreck, creating a map allows the crew to identify potential targets, cutting down exploration time and boosting mission efficiency. Before ROVs are deployed, the team must first map the area to understand the characteristics of the region and identify potential benthic habitats, seeps, and other environments and resources worthy of exploration. In addition to informing dive objectives, Nautilus transit routes cover unmapped areas of the ocean and contribute to the Seabed 2030 initiative, an international collaborative project to combine all bathymetric data into a comprehensive map of the ocean floor.

Photo: Nautilus. All images courtesy of Ocean Exploration Trust.

Multibeam Echosounder

Mounted on the hull of the vessel is a Kongsberg EM302 multibeam echosounder capable of accurately producing state-of-the-art maps covering large areas of the seafloor. The echosounder maps the seafloor at depths between 50 and 7,000 meters (about 160 to 23,000 feet) while cruising at ship speeds up to 12 knots (14 mph). The transmit array emits acoustic pulses that ensonify the seafloor with a wide fan-shaped swath of sound, while a second transducer receives the return signal echoes. Each pulse sends many beams of sound in a fan shape toward the seafloor. When these pulses strike the seafloor and return to the transducer/receiver combination mounted on the hull of the ship, the system computes a "sounding" associated with each returning pulse via the time it took to travel down and up through the water column. Because the ship is moving between the transmit and receive functions, a motion sensor connected to the system allows the echosounder to "steer" the sound pulses to correct for the ship's rolling and swaying motions. This allows the ship to collect an even distribution of data from the seafloor.

Received soundings are combined with the ship's Global Navigation Satellite System (GNSS) information to produce a grid or "digital elevation model" of ocean bathymetry, essentially a topographic map of the seafloor. Images such as those from Google Earth and other satellites offer few modern depth observations and only provide the general highs and lows of deep-sea topography.

Photo: A depiction of the seafloor using satellite data (left) and after a Nautilus pass (right), with multibeam sonar data processed in QPS Qimera.

The multibeam echosounder acoustically "sees" different scales and resolutions at different depths.
When Nautilus is mapping, the multibeam sonar fan covers a different width (scale) on the seafloor depending on the depth; however, the number of measurements across the swath of the fan remains the same. In shallow water, the soundings are closer together, delivering many details of the seafloor in a small area (higher resolution data). In deeper water, fewer details are available (lower resolution), but the multibeam fan of soundings covers a much wider area.

In addition to the depth, the signal strength that the sonar receives back from the seafloor ("backscatter") will differ depending on the type of seafloor that reflects the ping. By correcting this signal for the changes it undergoes traveling through seawater from the ship and back, the processing can extract information indicating variations in the seafloor type. Reflections from a rocky seafloor will generally provide a stronger signal than a muddy area. Backscatter measurements are then combined in another grid called a backscatter mosaic, which can be combined with the bathymetry grid to provide a better understanding of the shape and seafloor type.

The multibeam echosounder can also detect phenomena in the water column, such as plumes of bubbles emanating from the seafloor that indicate gas seeps. To date, Nautilus has documented thousands of methane seeps along the Cascadia Margin off the Oregon and Washington coast.

Photo: Sound waves reflect strongly off gas bubbles emanating from the seafloor.

Exploring Sub-surface Faults

Revealing structures below the seabed is just as important as discovering the seascape and habitats above. To complement the multibeam mapping work, the team uses a Knudsen 3260 sub-bottom profiler and echosounder. Mounted inside the hull of Nautilus, the echosounder operates at low frequencies to penetrate and reflect off the layers of sediment, revealing a cross-section of the seafloor structure. The dual-frequency profiler operates at 3.5 kHz or 15 kHz (two discrete channels with separate transducers) and is capable of full-ocean-depth soundings. An acoustic pulse is directed through the water column to the seafloor and then captured by the system as it bounces back from each layer. Scientists use this data to identify subsurface geological structures such as faults, ancient channels, and buried levees.

In early 2023, Ocean Exploration Trust (OET) installed a Kongsberg Simrad EC150-3C 150 kHz transducer on E/V Nautilus. Mounted within the ship's hull, the EC150-3C is the first of its kind to combine an acoustic Doppler current profiler (ADCP) and an EK80 split-beam fisheries sonar into one instrument. The ADCP, which measures the speed and direction of currents at various depths underneath the ship, supports safe remotely operated vehicle (ROV) operations and provides data for improving oceanographic current models. The integrated split-beam echosounder maps and characterizes features found within the water column, such as biology, scattering layers, and potentially bubble plumes. The EC150-3C will equip E/V Nautilus to better serve as an operations hub for multi-vehicle operations, increase OET's capacity to explore and map the water column, and collaborate with partners from the Ocean Exploration Cooperative Institute to advance combined robotics and new technologies that increase the pace of ocean exploration.
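The sounding computation described above reduces to simple geometry once the system has a beam's two-way travel time and steering angle. A minimal sketch, assuming a straight ray path and a constant 1,500 m/s sound speed (real systems ray-trace through a measured sound-velocity profile):

```python
import math

SOUND_SPEED = 1500.0  # m/s; a typical mean value, assumed constant here

def sounding(two_way_time_s: float, beam_angle_deg: float) -> tuple[float, float]:
    """Convert a beam's two-way travel time and steering angle (from
    vertical) into a (depth, across-track distance) pair in meters."""
    slant_range = SOUND_SPEED * two_way_time_s / 2.0  # one-way path length
    theta = math.radians(beam_angle_deg)
    return slant_range * math.cos(theta), slant_range * math.sin(theta)

# Nadir beam vs. a 60-degree outer beam, both with a 4-second round trip
print(sounding(4.0, 0.0))   # (3000.0, 0.0): straight down in 3,000 m of water
print(sounding(4.0, 60.0))  # outer beam reaches ~2,600 m across-track
```

This is also why swath width scales with depth while the number of beams stays fixed: the same angular fan simply projects onto a wider strip of seafloor.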
For more information:
Ocean Exploration Trust & Nautilus Live
Kongsberg Simrad EC150-3C
Seabed 2030
Qimera
- Disney Short Film Brings Augmented Reality Into Your Living Room
This original short film, Remembering, was produced using virtual production techniques and a companion Augmented Reality experience to extend the film beyond the screen and into the audience's home.

By Terry Persun | Film and TV, Cool Stuff | Aug 26, 2025

The Disney short film Remembering, by Emmy® winning director Elijah Allan-Blitz, stars Academy Award® winner Brie Larson as a writer who loses an important idea. Her inner child then goes on a journey to find it. The director and Disney wanted a powerful way to elevate the viewing experience of the film for Disney+ subscribers. The challenge was to capture the surprise, joy, and wonder Larson's character feels and allow subscribers to feel those same emotions.

Disney and StudioLAB, The Walt Disney Studios' advanced development division for innovation in creative technologies, pulled together two important technologies—virtual production and augmented reality—to produce Remembering, using game engine assets filmed on an LED stage and the full potential of virtual production techniques. The company then published those assets into a mobile Augmented Reality application, also built in a game engine, to extend the "World of Imagination" beyond the screen and into the audience's home.

Available via an app to select Disney+ subscribers, this first interactive content on Disney+ was triggered by specific moving images. When prompted, subscribers simply hold their device up to their TV screen to see an enchanting waterfall scene expand into their real-world living room, complete with frolicking dolphins, bright blue butterflies, and blossoming trees. When developing the app, ease of use was top of mind. So, all it takes is for the Disney+ subscriber to scan the room they're viewing the film in, then wait for the prompt in the movie to hold up their device.

According to director Elijah Allan-Blitz, "This is part of the future of how humanity will interact with entertainment. The Augmented Reality experience moves away from a typical passive experience of streaming and allows viewers to engage with it in a deeper way. That makes it something that you're going to remember on a deeper level than just something you watch. It's actually something you did."

Remembering: The AR Experience is the first Augmented Reality app that connects directly in sync with content on a streaming platform. This first-of-its-kind companion app provides an early look at the potential of AR experiences to enhance movie storytelling when viewers are watching at home.

For more information: Magnopus AR Disney+
- Virtual Production Stage for Film
Filmmakers and businesses get a creative refuge where they can plan, shoot, edit, score, and finalize projects.

Edited by Terry Persun | Film and TV | Jun 24, 2025

35North Studios is a state-of-the-art production studio that allows creators a peaceful space to focus on their craft and enjoy the process. The company's full-service approach provides clients with a well-equipped studio situated in Clear Lake, Iowa. At its 12-acre campus, 35North Studios operates out of a 225,000-square-foot facility that includes soundstages and editing suites, in addition to a recording studio, equipment rental house, and production office space.

With an eye on the trends shaping entertainment and production, 35North Studios' executive leadership began paying close attention during the pandemic lockdowns when virtual production projects started to accelerate. They conducted extensive research and evaluations into virtual production methodologies and the technologies that enable them. Soon after, they committed to building their own virtual production stage.

Photo courtesy of 35North Studios.

"It's just ingrained in us to always be looking forward and to stay ahead of the curve with the latest industry tools," said Justin Fairfax, Director of 35North Studios. "We also saw the opportunity to be an early adopter in the Midwest."

Technically Advanced

35North Studios' virtual production workflow features an OptiTrack camera tracking system—a 3D optical tracking technology with sub-millimeter accuracy for virtual production stages and other industry applications. OptiTrack is a precision 3D tracking system that provides low-latency output, easy-to-use workflows, and a host of developer tools. The system's primary markets include drone and ground robot tracking, movement sciences, virtual production and character animation for film and games, and mixed reality tracking.

The specific OptiTrack system used by 35North Studios comprises 12 SlimX13 cameras—a lightweight, high-frame-rate capture product with a discreet profile, designed with simplicity and usability in mind. The system also includes CinePuck, a camera tracking tool for virtual production studios that can be seamlessly integrated into any production workflow. The studio's stage is also equipped with fine-pitch LED video walls and a ceiling from OptiTrack's sister company Planar, a leading provider of LED display solutions, as well as ARRI cinema cameras. Additionally, 35North Studios custom built all of its rendering nodes and computer hardware systems.

Flexibility and Stability

35North Studios selected an OptiTrack system after evaluating the different types of motion capture technologies, including how each one would support their vision for the new virtual production stage, both immediately and long term. "We wanted our LED ceiling to be a reflective surface at all times, which automatically ruled out inside-out camera tracking," Fairfax said. "If we had to place a bunch of trackers on the ceiling for positional tracking, then that would mean they would be visible in the reflections. We wanted to avoid that." Because of their need for more creative latitude, their search eventually landed on optical camera tracking and the OptiTrack system. "For us, it's all about flexibility and stability," Fairfax said.
"With an OptiTrack system, we are not tied solely to virtual production. For example, if we decide at a later stage to invest in an animated feature that needs motion capture, we can do that."

The flexibility of OptiTrack proved beneficial when 35North Studios wanted to expand its tracking volume. "We decided to also track our side walls to Unreal, which allows us to avoid having to remap everything every time we move one of those mobile walls," Fairfax said. "That wouldn't be possible using different tools."

The decision to integrate an OptiTrack system was also based on 35North Studios' plan to use two ARRI cameras in the virtual volume and to be able to track both at once. OptiTrack also provides the ability to track props, which the company can fit with an active tracker and send to Unreal in real time. "It's such a multipurpose tool," Fairfax said.

According to Fairfax, OptiTrack provides several other advantages—the system is user-friendly, the equipment is dependable, and the software is easy to learn and understand. But what stands out the most is that the technology is virtually unnoticeable. "I haven't had to worry about it being visible once on set," Fairfax said. "It's never a thought in my mind."

For information:
OptiTrack
Planar
Epic Games
ARRI
Brompton Technology
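At the heart of the optical tracking described above is a rigid-body pose solve: given a known marker arrangement (such as a CinePuck's) and the 3D marker positions triangulated by the cameras, recover the body's rotation and translation. A minimal sketch using the standard Kabsch/SVD method (our illustration of the general technique, not OptiTrack's implementation):

```python
import numpy as np

def rigid_pose(model: np.ndarray, observed: np.ndarray):
    """Best-fit rotation R and translation t such that
    observed ~= model @ R.T + t (Kabsch algorithm, Nx3 arrays)."""
    mc, oc = model.mean(axis=0), observed.mean(axis=0)
    H = (model - mc).T @ (observed - oc)    # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, oc - R @ mc

# Hypothetical asymmetric 4-marker body, rotated 30 degrees about Z and shifted
model = np.array([[0, 0, 0], [0.10, 0, 0], [0, 0.15, 0], [0, 0, 0.07]])
a = np.radians(30)
R_true = np.array([[np.cos(a), -np.sin(a), 0], [np.sin(a), np.cos(a), 0], [0, 0, 1]])
observed = model @ R_true.T + np.array([1.0, 2.0, 0.5])
R, t = rigid_pose(model, observed)
print(np.allclose(R, R_true), np.round(t, 3))  # True [1. 2. 0.5]
```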












