
  • To Capture Fast-Moving Subjects, Photographers Need the Right Equipment

    Gimbal fluid head enables smooth, shake-free panning without micro-jitters for perfect shots. Edited by Terry Persun, Film and TV, Oct 10, 2025

    A photograph is more than an image—it’s a piece of art suspended in time. It captures light, emotion, and motion, distilling them into something lasting. Some photographs tell stories. Others reveal the unseen. The finest do both—and they endure.

    One of nature photography’s most captivating subjects is the Gruccione, or European Bee-eater—a bird whose radiant plumage and elegant flight have made it a favorite among wildlife photographers such as Michael Lovera. Yet the Gruccione is far from an easy subject to photograph. Found along riversides and meadows from late spring to mid-summer, the Bee-eater’s beauty is rivaled only by the challenge of capturing it.

    Migratory by nature, the Bee-eater spends winters in sub-Saharan Africa and returns to Europe each year to breed. This long journey across continents makes it a symbol of seasonal change, endurance, and ecological connection—a perfect metaphor for timelessness, motion, and return in visual storytelling. Its plumage is among the most colorful in Europe, with a turquoise chest, golden-brown back, lemon-yellow throat, and bold black eye stripe. It nests in colonies, digging long horizontal tunnels—sometimes stretching 1–2 meters—into sandy riverbanks or soft cliffs. These nesting sites, often reused year after year, are typically precarious and hard to access, requiring photographers to set up on unstable ground or even in water.

    As its name suggests, the European Bee-eater feeds mainly on flying insects such as bees and wasps, which it catches mid-flight with remarkable precision. Before swallowing stinging prey, it performs a dramatic behavioral flourish: rubbing the insect against a branch to remove the stinger. Capturing this moment adds narrative depth to a photo and rewards the patient, observant photographer.

    During nesting season, typically between May and July, Bee-eaters establish colonies along exposed slopes or sandy banks. Photographers seeking the most compelling images often work near these sites, building camouflaged hides and enduring long, silent hours in often extreme environments. High temperatures, harsh light, persistent insects, and unpredictable terrain make reliability non-negotiable.

    Gear with Unshakable Stability

    On Michael Lovera’s journey to capture the Bee-eater, the Gitzo Systematic Tripod and Gimbal Head proved indispensable. The four-section Systematic tripod, combined with the G-lock system, allowed quick, precise height adjustments and adaptability on any terrain—from riverbanks to uneven ground—while providing excellent stability for ground-level shots, which Lovera noted as a personal favorite. Rubber articulated feet played a crucial role in maintaining grip on difficult surfaces, including slippery submerged rocks under flowing water. The tripod absorbed shocks and kept the setup steady and secure from the first moment. Thanks to its fluid movement and total control, the Gitzo Gimbal Head performed flawlessly not only in photography but also in video work—enabling smooth, shake-free panning without micro-jitters.

    The ergonomic knobs with superior grip ensured full control, even when wet or in contact with water—a common scenario in wildlife photography, where fast reactions are everything. The carbon fiber legs of the Systematic and the magnesium construction of the Gimbal Head made the entire system light and easy to carry across long treks and difficult landscapes. For wildlife photographers constantly balancing performance with portability, this combination delivered the perfect compromise between quality and weight.

    “In these products, I found the security I was looking for without giving up anything else: security, stability, or elegance. Gitzo provided the stability I needed to shoot with absolute precision—even when partially immersed in water.”

    This article is part of the “Photography That Lasts Forever” campaign by Gitzo. This was a Gitzo adventure with Michael Lovera.

    For more information: Gitzo GT4543LSUS—Gitzo Systematic tripod, series 4 long, 4 sections. GHFG1—Gimbal Fluid Head. Michael Lovera on Instagram. Read more about photography.

  • World’s Tallest Observation Wheel

    Caesars Entertainment in Las Vegas uses SKF giant main bearings and related technologies. SKF, Cool Stuff, Theme Parks, Jun 4, 2025

    The High Roller observation wheel is expected to soar above the heart of the world-famous Las Vegas Strip in late 2013. The 550-foot-tall wheel, eclipsing the famed 443-foot-tall London Eye, will be equipped with two SKF spherical roller bearings, the largest ever produced at the SKF manufacturing facility.

    The two custom-designed spherical roller bearings (one on each side) will be virtually unprecedented in weight and size. Each bearing will weigh approximately 8,800 kg, with a 2,300 mm outer diameter, a 1,600 mm inner diameter, and a width of 630 mm. Specially engineered features will include W26 lubrication holes in the inner ring, SKF NoWear®-coated rollers, and a PTFE coating in the bore. The company will additionally incorporate advanced lubrication, sealing, and online condition-monitoring systems, and will take the lead in monitoring the bearings once the wheel is up and running.

    SKF was awarded the contract by American Bridge Company (Coraopolis, PA, USA), which is responsible for constructing the 143 m diameter tension wheel. In addition to the two SKF bearing assemblies, the structure will consist of four steel support legs, a single braced leg, a fixed spindle, a rotating hub, a 2 m diameter tubular rim, and 112 locked-coil cable assemblies as spokes.

    The High Roller wheel is the centerpiece of The LINQ, a planned $550 million, open-air retail, dining, and entertainment district situated between Imperial Palace and Flamingo Las Vegas. Each of the wheel’s 28 supersize cabins (which themselves will revolve) will accommodate up to 40 people during a 30-minute ride.

    The High Roller is the latest in a growing portfolio of SKF big-wheel projects. The Navy Pier® Ferris wheel in Chicago similarly operates with two SKF spherical roller bearings, and SKF Reliability Systems retains responsibility for its ongoing proactive maintenance programs.

    For more information: SKF Home

  • How Motion Control Makes Tron: Ares a Powerful New Visual Experience

    “We were pushing the limits of the robots.” SISU Cinema Robotics on reaching new heights with motion control robotics. Joe Gillard, Film and TV, Aug 12, 2025

    Tron: Ares is the third installment of Disney’s Tron franchise. The film stars Jared Leto, Greta Lee, Evan Peters, Hasan Minhaj, and others, while engineers, cinematographers, and crew work behind the scenes to create the movie magic that’s possible in 2025.

    ALSO FROM EE: WATCH: One-on-One with Chris Porter of igus

    Tron has always made use of innovative special effects, and this latest installment carries on that technological legacy. In an interview, SISU Cinema Robotics, the company responsible for motion control robotics on the film, explained some of the motion control innovations used. Films use robotics and motion control to capture interesting or difficult shots, allowing the camera to jump or move around an object or person in quick or unusual patterns.

    Scary shots in Tron: Ares

    Cinematographer Jeff Cronenweth and SISU’s Mike Morgan explained how the safety-conscious SISU Cinema Robotics pulled off some harrowing shots, in an interview published on YouTube.

    “We had some shots in our film where we had Jared Leto laying on his back and a 12-inch probe lens half an inch from his eyeball and rotated the camera up and then pulled back away from him, and that is terrifying,” said Cronenweth.

    The motion control and robotics are a crucial part of how the shots achieved the desired look. “We wanted to create, in this digital world, when they’re in the web, this very mechanical program perspective of things,” said Cronenweth.

    Watch the full interview below:

    And here is a trailer for TRON: Ares:

    Learn more about SISU Cinema Robotics. Learn more about Tron: Ares.

  • Giant Laser-Headed Robot Dances at Drake Concert

    Industrial-class robots mounted to one another deliver a light show with precision of motion and the grace of a dancer. Terry Persun, Stage Events, Cool Stuff, Sep 11, 2025

    A humanoid robot is basically a collection of robots mounted to each other. When designing the entertainment robots for the Drake concert, andyRobot had to be very careful. The robot heads held super-powerful lasers that had to be aimed away from the crowd for safety purposes. Yet they also had to present spectacular choreography for the viewing audience.

    Standard industrial-class robots from KUKA take on a new role in an entertainment environment. andyRobot uses the exact same robots you might find in a manufacturing plant in a way that provides smooth, high-precision motion night after night, even when the robots are continually moved from one location to the next. At each new arena, a team chalks out the locations of the robots at four corners—often up to 200 feet apart from one another. The robots are brought in by semi-truck, carefully unloaded, and placed into precise position. The combined components are assembled to create humanoid robots that rise more than eight feet above the crowd.

    Each multi-ton humanoid has a laser for a head and arms that hold 16-inch mirrors in each hand. During the Drake concert, the humanoids shoot lasers out of their heads and bounce them off the mirrors held by other robots standing across and diagonally from them, creating a spectacular light show over the heads of the audience.

    The show’s complexity of movement was created through andyRobot’s proprietary software, Robot Animator. Robot Animator employs inverse kinematics to control multiple robots. Inverse kinematics, a technique shared with computer animation, works backward from a desired end point (say, the position of a mirror in a robot’s hand) to compute the joint motions needed to reach it, which makes the resulting movement look more natural. Robot Animator is a plug-in that works inside Autodesk Maya.

    Robots used in manufacturing move quickly until they are in position, stop abruptly, perform the operation, and move quickly away, so there is a jerkiness to their movement. Think of it this way: in a stage production at a concert, you want the precision required in a manufacturing facility, but without the jerky start-and-stop motion. The robot movements created by Robot Animator must vary from slow to fast to follow the music being played, producing surgically exact movements at every show. Robot Animator is able to smoothly ramp up and ramp down every motion—and do it with precise accuracy.

    According to Andy Flessas, President of andyRobot, “Grace of movement is created through Robot Animator’s Motion Planning software, which translates industrial robot motion into the language of animation.” All photos courtesy of andyRobot.

    Basically, to do this, the software must ease in and ease out of every movement. For example, it can take over thirty separate micro moves to produce one second of actual robot motion, all of which allows the humanoid robots at the Drake concert to look like they are dancing (a minimal sketch of this easing idea appears below). The robots used in the Drake concert include four KUKA KR210 robots, each with six axes of movement, and eight KUKA KR10 robots, each also with six axes of movement.
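    Robot Animator’s internals are proprietary, so the following is only a minimal Python sketch of the ease-in/ease-out idea described above, with hypothetical function names and a made-up rate of thirty micro moves per second:

```python
import numpy as np

def smoothstep(t: np.ndarray) -> np.ndarray:
    """Classic ease-in/ease-out curve: velocity is zero at both ends."""
    return 3 * t**2 - 2 * t**3

def micro_moves(start_deg: float, end_deg: float, duration_s: float,
                rate_hz: int = 30) -> list[tuple[float, float]]:
    """Break one joint move into small time-stamped steps.

    At 30 steps per second, a one-second move yields about thirty
    micro moves, matching the scale the article describes.
    """
    n = max(2, int(duration_s * rate_hz))
    ts = np.linspace(0.0, duration_s, n)
    eased = smoothstep(ts / duration_s)
    angles = start_deg + (end_deg - start_deg) * eased
    return list(zip(ts, angles))

# Example: sweep one joint through 90 degrees over one second of music.
for t, a in micro_moves(0.0, 90.0, 1.0)[:5]:
    print(f"t={t:0.3f}s  angle={a:6.2f} deg")
```

    Because the eased curve starts and ends with zero velocity, the joint accelerates and decelerates gradually instead of snapping between positions, which is the quality that separates a dancing robot from a factory one.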
    All told, each of the four humanoid robots incorporates 18 axes of motion that must be precisely choreographed with one another. That’s 72 axes of motion that had to be programmed, second by second, for a two-hour concert. andyRobot provides the key components necessary for all types of entertainment using the same robots you might see in any manufacturing plant. By adapting them for movements not normally found in industry, andyRobot channels the precision required in manufacturing into a creative experience for a large audience.

    For more information: andyRobot, Autodesk Maya, ABB Robotics, KUKA Robotics

  • The Wizard of Oz at Sphere in Las Vegas Will Use Google’s AI to Create an Immersive Experience

    Generative AI will expand scenes and enhance characters. Joe Gillard, Film and TV, Stage Events, Jun 7, 2025

    Google and Sphere Entertainment, operator of the immersive entertainment venue in Las Vegas, Nevada, announced a partnership to develop The Wizard of Oz at Sphere using generative AI (gen AI), according to a press release. The companies say the collaboration will involve engineering work, as well as “thousands of creators, coders, VFX artists, and more,” to present the immersive experience, which opens in Las Vegas on August 28, 2025.

    Google Cloud and DeepMind will deploy the Gemini, Veo 2, and Imagen 3 models to enhance the film’s resolution, extend backgrounds, and digitally recreate existing characters who would otherwise not appear on the same screen. Sphere is also using Google Cloud’s AI-optimized infrastructure to support the data and computational demands of immersive experiences, with The Wizard of Oz at Sphere processing 1.2 petabytes of data over the course of the project to date.

    Google is one of many tech giants ramping up AI across multiple domains, though the film industry, in particular, has been somewhat hesitant to embrace AI. Nevertheless, Google is moving forward in this attempt to combine cinema with AI for an immersive entertainment experience.

    “Our partnership with Sphere on The Wizard of Oz at Sphere is a great example of pushing the boundaries of generative AI to deliver exciting new experiences for audiences – and new opportunities for studios and filmmakers,” said Thomas Kurian, CEO, Google Cloud. “We are honored to play a role in such an ambitious project to bring a classic piece of Americana to an entirely new generation of audiences.”

    Google AI and The Wizard of Oz at Sphere

    Originally released in 1939, The Wizard of Oz was filmed using what was at the time a revolutionary three-strip Technicolor 35mm motion picture camera and was only the third Hollywood production to bring this color process to cinema audiences. Google and Sphere are quick to point out that this same film is part of their own attempt to innovate nearly 90 years later.

    Sphere will bring an immersive version of The Wizard of Oz to its 160,000-square-foot interior display plane, using Google AI alongside traditional VFX and film techniques to expand scenes and enhance characters. Google Cloud and DeepMind are employing Gemini, Veo, and Imagen models, as well as Google Cloud infrastructure such as the company’s custom AI accelerators, Tensor Processing Units (TPUs), Google Kubernetes Engine (GKE), and more.

    One technique being used for the film is called “super resolution,” intended to “intelligently enhance” the film’s resolution, filling in missing pixels to create an ultra-crisp 16K image, essential for Sphere’s 16K x 16K resolution interior display plane. Other techniques include “outpainting,” which expands the film’s visual scope for Sphere’s immersive environment by extending backgrounds and characters; “performance generation,” for storytelling techniques that allow multiple characters to remain on screen for extended periods; and a large context window, for ensuring that the enhanced visuals remain consistent throughout the film.

    “The power of generative AI, combined with Google’s infrastructure and expertise, is helping us to achieve something extraordinary,” said Jim Dolan, Executive Chairman and CEO, Sphere Entertainment. “We needed a partner who could push boundaries alongside our teams at Sphere Studios and Magnopus, and Google was the only company equipped to meet the challenge on the world’s highest resolution LED screen.”

  • Georgia Aquarium Uses Technology to Give Low Vision Visitors an Enhanced Experience

    The aquarium is partnering with ReBokeh to enable low vision visitors to explore the aquarium using their own eyesight. EE Staff, Theme Parks, Museums, Cool Stuff, Sep 17, 2025

    Georgia Aquarium is partnering with ReBokeh Vision Technologies to offer free access to ReBokeh’s assistive technology software for people with low vision. The partnership provides Aquarium guests and staff with access to ReBokeh’s app-based software, intended to help people with low vision adjust the appearance of the world around them.

    Georgia Aquarium is the largest aquarium in the Western Hemisphere, with more than 500 species. This partnership will allow “the 90% of low vision individuals who retain functional vision to experience the wide range of animals and exhibits using their own vision, rather than defaulting to tactile or audio descriptions,” according to a press release.

    The app-based technology behind it

    ReBokeh’s technology works through the live camera feed on a guest’s mobile device, overlaying customized filters that adjust aspects like contrast, color hue, zoom, and lighting to meet the needs of low vision users (a generic sketch of this kind of per-frame filtering appears at the end of this article). An AI feature also lets guests ask ReBokeh’s AI tool questions about what’s around them and what they’re seeing. The app works in multiple languages as well, so it can serve as a translation tool for signage, information, and visual surroundings.

    Watch to see how the app’s general technology works.

    “Our key mission is to unlock the ocean for all; that includes providing features and opportunities like ReBokeh’s technology for guests so they can experience our larger-than-life animals and the wonder they invoke. This partnership with ReBokeh is integral to our continued commitment to accessibility for all,” said Sam Herman, Director of Guest Programs at Georgia Aquarium.

    “A day at the Aquarium is an opportunity to see the magic of our oceans and the incredible variety of wildlife that call the sea home,” said Rebecca Rosenberg, the low vision founder of ReBokeh. “Being able to see and interact with each exhibit using your own eyesight can be an incredibly powerful experience for people with low vision. We couldn’t be more excited to partner with Georgia Aquarium to make them the first aquarium on the planet to create these new and immersive experiences for the low vision community.”

    ReBokeh is extending this partnership opportunity to other museums, zoos, and public spaces.

    Source: https://www.rebokeh.com/
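    ReBokeh’s own filter pipeline is not public; the sketch below is a generic Python illustration of the kinds of per-frame adjustments the article lists (contrast, lighting, zoom), using Pillow and hypothetical default values:

```python
from PIL import Image, ImageEnhance

def low_vision_filter(frame: Image.Image, contrast: float = 1.8,
                      brightness: float = 1.2, zoom: float = 2.0) -> Image.Image:
    """Apply one guest's tuning to a single camera frame.

    The parameter names mirror the controls the article describes;
    each low vision user would dial in their own values.
    """
    # Zoom: crop the central 1/zoom region, then scale it back to full size.
    w, h = frame.size
    cw, ch = int(w / zoom), int(h / zoom)
    left, top = (w - cw) // 2, (h - ch) // 2
    frame = frame.crop((left, top, left + cw, top + ch)).resize((w, h))

    # Contrast and lighting boosts, applied in sequence.
    frame = ImageEnhance.Contrast(frame).enhance(contrast)
    frame = ImageEnhance.Brightness(frame).enhance(brightness)
    return frame
```

    In a live app, a function like this would run on every camera frame, so the world itself, not a photo of it, appears with the guest’s preferred contrast and magnification.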

  • Robots Used in Food Preparation, Serving, and Delivery

    The food and beverage industry is one of the largest and most dynamic sectors globally, with new trends and demands continuously shaping the market. Terry Persun, Cool Stuff, Jul 9, 2025

    Technological advancements have brought about several changes in the food and beverage industry, with robotics being one of the most impactful. Robots have been introduced to perform various tasks, ranging from food preparation to serving and even delivery. Here is a rundown of some of the latest advancements in robotic technology and the benefits that come with them.

    Food Preparation

    Traditionally, food preparation has been a time-consuming and labor-intensive process, involving the use of multiple kitchen tools and equipment. However, robots have made food preparation more efficient and cost-effective by handling tasks ranging from chopping and slicing to mixing and cooking. Food prep robots are designed to work alongside human employees, with minimal supervision, and can handle large quantities of food.

    For example, Moley Robotics has developed a robotic kitchen that can prepare thousands of recipes with precision and speed. The robots measure and weigh ingredients with high accuracy and are incredibly fast during operation, helping to increase the output of a kitchen and ensure that orders are fulfilled promptly. Plus, robots offer consistency, an important factor in food preparation that ensures the final product will always have the same quality, regardless of who is preparing it. Moley Robotics has systems that can cook for you, teach you to cook, and more.

    Robotic Serving

    Photo courtesy of Lin Engineering.

    The traditional method of serving food and drinks in the food and beverage industry has always involved human servers. However, robots are now being used to serve food and drinks to customers in restaurants, cafes, and other food establishments. These robots are designed to be interactive and engaging, with some programmed to communicate with customers directly.

    For example, Servi, a robot developed by Bear Robotics, is designed to serve food and drinks to customers and navigate through crowded restaurants. Servi is equipped with proximity sensors that help it detect obstacles, enabling it to navigate through tight spaces. Additionally, Servi is designed to interact with customers, engaging them in conversation and providing them with a unique dining experience. Robotic serving can not only enhance the customer experience but also help reduce labor costs, so that funds can be redirected toward other areas of the business. It can also improve service speed, reducing waiting times and increasing customer satisfaction.

    Robotic Delivery

    Robotic delivery is a new and exciting development that provides an innovative solution to multiple food delivery challenges. With the increasing demand for food delivery services, the use of robots to deliver food and drinks has become more prevalent. These robots are designed to navigate city streets and traffic, delivering orders to customers efficiently and effectively.

    Photo courtesy of Lin Engineering.

    One example is the robot developed by Starship Technologies, which is equipped with sensors and cameras that enable it to navigate city streets and avoid obstacles. The robot can be programmed to deliver orders to specific locations, and customers can track the progress of their orders through a mobile app. These robots can travel up to 4 mph and deliver orders within a two-mile radius, making them ideal for delivering food and drinks to customers in urban areas. Overall, robotic delivery reduces delivery costs and improves delivery speeds, which increases customer satisfaction.

    Custom Robotic Solutions

    In addition to the many benefits that robotic technology brings to the food and beverage industry, custom motion control solutions from Lin Engineering can further enhance these advantages. Lin Engineering specializes in custom motion control, with products such as stepper motors, brushless motors, linear actuators, servos, and motorized traction wheels. In the food and beverage industry, Lin Engineering provides motion control solutions that improve the performance and efficiency of robotic equipment. The company’s solutions enhance the precision and speed of robotic food preparation, serving, and delivery, ensuring that food establishments can meet the demands of their customers.

    By partnering with Lin Engineering, food and beverage robotics manufacturers can benefit from custom motion control solutions tailored to their specific needs. These solutions can help reduce costs, improve efficiency, and enhance the customer experience, making them a valuable addition to the industry’s robotic technology.

    For more information: Moley Robotics, Bear Robotics, Starship Technologies, Lin Engineering

  • Electric Mobility Technology Meets Utility With This Highly Versatile Vehicle

    Innovation has no bounds when an engineer tackles an industry need and comes up with a single e-mobility concept to fit multiple applications. Terry Persun, Sports, Cool Stuff, Nov 6, 2025

    E-mobility is flourishing around the world, with everything from skateboards to trucks. What hasn’t been available is a modular concept that is highly versatile and powerful. Envo designs and manufactures versatile mobility solutions for commuting, recreation, and utility operations, offering everything from electric bikes to snow bikes and trikes. Their latest endeavor is the Utility Personal Transporter (UPT).

    According to Envo Founder and CEO Ali Kazemkhani, “Our forward-thinking team recognizes significant gaps in the e-mobility industry, particularly between e-bikes and e-cars, for both personal transportation and cargo/utility purposes. These untapped opportunities hold immense potential in the market for products like our new UPT. With it we’re introducing a highly versatile, 4-wheeled mobility platform, with niche futuristic micro-mobility options as clean alternatives to UTV/ATVs, Cars, and Trucks.”

    The innovative UPT is a powerful, long-range, all-wheel-drive utility platform. This flexible vehicle starts with a skateboard chassis similar to what you might find on a golf cart, except that the basic unit doesn’t have a body or seats. Instead, the UPT offers a wide range of possible configurations designed to handle anything from garden and home improvement jobs to snowplowing to backcountry rescue and much more—including carrying equipment on movie/TV sets, mounting cameras for motion capture, or simply ferrying people or equipment from one place to another when converted into a fully covered micro e-car.

    The direct-drive PMSM (permanent magnet synchronous motor) hub motors were designed in-house because of the specific geometry and features required of the UPT: high efficiency, high performance, light weight, and modularity. The motors were then wound, assembled, and tested at a motor factory before delivery to Envo. Each is a 23-pole-pair motor that provides speeds up to 60 kph with a maximum torque of 140 Nm. The motors are IP67 rated, meaning they are dust-tight and protected against immersion in up to one meter of water for thirty minutes.

    Images courtesy of ENVO.

    The four wheels are independently controlled and managed through a main VCU (Vehicle Control Unit) mounted inside the main chassis. The VCU controls motor-wheel behavior based on driving demands such as cruise, traction, anti-slip, and tank turn, along with driving modes including sport/eco, 2WD, 4WD, and many more. The controller is modular and updatable for a limitless variety of use cases.

    A dashboard VCU is available with an HMI (Human Machine Interface) that is external to the main chassis. This touchscreen display not only indicates vehicle driving information but also controls all components and accessories that can be added to the vehicle—module by module. Further, the HMI is CarPlay/Android Auto capable for the other apps used in modern cars. Through the use of a CANBUS network, any future electric or powered accessory could be added and controlled by the same HMI dashboard.
    Anything that can be controlled or monitored by the HMI dashboard can also be controlled remotely.

    Images courtesy of ENVO.

    The dashboard VCU is responsible for matching the vehicle user interface with the main VCU, which means that vehicle developers or individuals (including ENVO) will have different options for controlling the powertrain. A handlebar, thumb throttle, brake lever, or steering wheel with pedals are all possible. Even steer-by-wire and brake-by-wire for robotic and autonomous applications are possible, with everything communicating with the main VCU to drive the vehicle. (A hypothetical sketch of this kind of CAN-based wheel control appears at the end of this article.)

    The UPT’s modular design, flat floor surface, and array of attachment points are the key to its configurability. The base platform features a fold-up, telescoping steering column for standing operation—plus an adjustable fold-down seat/leaning support—and allows for simple, narrow-profile upright or stacked storage and transport. From there, users can install accessories like high-capacity carriers and trays, a variety of seating options, gear racks, weather protection, and more, plus front and rear hitches for plowing and hauling.

    With payload capacities of up to 250 kg and towing capacities of up to 350 kg, the UPT is capable of traveling over 100 km. In addition, the unit boasts 12,000 watts of power harnessing 640 Nm of maximum torque. Key design and functionality features include a fully adjustable upright handlebar for use in the left, right, or center location; integrated headlights, taillights, and indicators; and multiple attachment points for various configurations. The flat deck allows for maximum cargo capacity, and a pass-through area allows storage or transport of longer items. The handlebar and seat fold to make the UPT stackable and to provide stand-up, compact storage.

    Images courtesy of ENVO.

    While the company continues to beta test the UPT, it expects to start deliveries in less than a year and is already taking pre-orders. The company also has early interest from the likes of law-enforcement groups, rescue teams, hunting/fishing organizations, construction companies, park/forest administrators, commercial farms, entertainment venues, and general consumers.

    The UPT is a mass-customization platform that incorporates standardized modules that assemble into various LEV chassis. The company expects to scale by leveraging its existing supply chain and by partnering with small to medium manufacturers worldwide to deliver locally tailored vehicles for personal, commercial, and government needs. ENVO will provide the IP and upstream components while local partners build and support the local community. This approach acknowledges the niche nature of the product and its need for deep customization. The goal is to address varied mobility needs using a fully customized mobility solution—the UPT platform. The company is presently seeking partners in the USA, India, China, Europe, Canada, the Middle East, and ANZ for local production.

    For more information, visit ENVO. More articles about vehicles >>>
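    ENVO has not published its CAN message set, so the following Python sketch is purely illustrative of the architecture the article describes: a central controller commanding four independent wheel motors over CANBUS. The message IDs, scaling, and function names are all hypothetical, and the python-can library is assumed.

```python
import can  # python-can, assuming a SocketCAN interface such as "can0"

# Hypothetical arbitration IDs for the four wheel controllers.
WHEEL_IDS = {"FL": 0x101, "FR": 0x102, "RL": 0x103, "RR": 0x104}

def torque_frame(wheel: str, torque_nm: float) -> can.Message:
    """Encode a per-wheel torque command as a signed 2-byte value in 0.01 Nm."""
    raw = int(torque_nm * 100)
    return can.Message(arbitration_id=WHEEL_IDS[wheel],
                       data=raw.to_bytes(2, "big", signed=True),
                       is_extended_id=False)

def tank_turn(bus: can.BusABC, torque_nm: float = 50.0) -> None:
    """Spin in place: left-side wheels reverse while right-side wheels drive forward."""
    for wheel in WHEEL_IDS:
        sign = -1.0 if wheel.endswith("L") else 1.0
        bus.send(torque_frame(wheel, sign * torque_nm))

# Usage (on real hardware): tank_turn(can.Bus(channel="can0", interface="socketcan"))
```

    Because every wheel is an independent node on the bus, modes like tank turn, anti-slip, and 2WD/4WD reduce to different per-wheel torque recipes sent from the same VCU.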

  • WATCH: One-on-One with Chris Porter of igus

    An insightful dialogue about what goes into designing something new and how it relates to other projects. Videos, Theme Parks, Aug 11, 2025

    In a casual conversation with Chris Porter, the Industry Manager for Stage & Amusement in Central Florida for igus, Inc., Entertainment Engineering explores how close designers get to their projects, how they start the process of designing something new, and how they are always working on several projects in several different industries at the same time. In this video, we show how creativity and inspiration can come from anywhere and how easily they can reshape a solution to fit a specific application. Whether working with off-the-shelf components or thinking up a completely off-the-charts custom approach, it’s all about bringing your experience as well as an open mind to the table.

    Note to our audience: Technology Transfer DIY Stories

    Entertainment Engineering Magazine is looking for stories about your home projects, where you take the technological experience you have at work and apply it at home. I know engineers in aerospace who have rebuilt their lawn mowers, and engineers in medical devices who built their own toy rocket. The spark of inspiration can come from anywhere. And once that spark is lit, the design can morph and adapt along the way. We’re interested in your non-work-related projects. What are you designing at home, for yourself or a neighbor or relative? Is it a Halloween project? A Christmas project? Or just a backyard thing you’re working on?

    Here’s what we’d like to see: 1) What got you thinking about the project in the first place? 2) What is your background, and why did you think you could tackle the project? 3) What technologies did you use (electrical, electronics, mechanical, fluid power, materials, etc.)? 4) How does the end product work, and how did it do what you needed it to? 5) Answer those four questions in less than 800 words (and focus on the tech), then send us your story, some photos, and even a video if you have one. We can’t publish every story, but we can choose a few a month to publish. If you want to read an example, check this out:

    You can reach us at: Contact@EntertainmentEngineeringMagazine.com

  • Custom Software and Touch Screen Technology Used for Wine Tasting Experience

    A first-of-its-kind interactive digital touch table and software package is custom built for wine tasting. Jim Spadaccini, Founder & Creative Director of Ideum, Cool Stuff, Jun 16, 2025

    I am thrilled to be the first to announce that Ideum has released a new touch table called the Tasting Table. It comes bundled with software that facilitates wine tasting. The Wine Experience allows guests to learn more about the tasting process and share what they experience. The software provides a new way to experience wine, demystifying and deepening the tasting experience—part of the company’s Sensory Dining experience.

    Photo: Nebbiolo ripens on the vine in the author’s backyard vineyard.

    It is not hyperbole to say that we’ve been working on this new interactive experience for decades. The last project I managed at the Exploratorium in San Francisco was the Science of Wine project, during which I was introduced to Ann Noble, the U.C. Davis professor who developed the Wine Aroma Wheel. A custom, highly modified version of the open-source Wine Aroma Wheel plays an important role in The Wine Experience application. We’ve made it interactive: guests can select and share the flavors they are tasting. Analytics are built into the software, so a winery that uses this application can see what its guests are experiencing, effectively crowdsourcing its tasting notes (a generic sketch of this flow appears later in this article).

    How it Works

    The Wine Experience software is paired with our new Tasting Table, developed explicitly for tasting rooms, popup events, and other spaces where wine tastings happen. The Tasting Table has a unique design: it is bar-height and has an optically bonded 55-inch touch display. An onboard computer makes the system plug-and-play. The system is lockable, and the software only requires an internet connection to load new content.

    In addition, the experience uses our object-recognition system, Tangible Engine. This software lets the Tasting Table recognize up to eight different wines through the interactive coasters that come with it. Tangible Engine was the first object-recognition software package for projected-capacitive touch tables; many design firms, including ours, use it.

    Over the years, we developed experimental applications that involved tasting. The Interactive Coffee Experience was designed with Starbucks and appeared at several pop-up events. We also developed an interactive wine-tasting experience, the JCB Tasting Salon, with JCB Wines. More recently we worked with MSC Cruises to create the Interactive Wine Bar for the Euribia cruise ship. These experiences, and the great and knowledgeable collaborators we worked with, have contributed to our thinking as we developed this exciting new wine-tasting experience. In addition, we worked closely with VARA Winery & Distillery, who provided excellent feedback during our extensive testing and tasting sessions!

    Photo: Ambassador and former NM Senator Tom Udall and entrepreneur Steve Case visit Ideum and try an interactive coffee tasting developed with Starbucks.

    Our mission was to create a guest-centric experience, focusing on the individuals tasting the wine. We want to enhance the tasting experience, making the interactive experience less of a brochure for the winery and more about the social experience surrounding tasting and the joy of tasting fine wines.
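    Tangible Engine’s actual API is not shown here; the Python sketch below simply illustrates the flow described above, with a hypothetical mapping from coaster tag IDs to wines and a counter that crowdsources guests’ flavor picks for the winery:

```python
from collections import Counter
from dataclasses import dataclass, field

# Hypothetical tag-to-wine mapping for one tasting (the table supports up to eight).
COASTER_WINES = {101: "2022 Tempranillo", 102: "2023 Chardonnay"}

@dataclass
class TastingSession:
    """Aggregate guest flavor selections into crowdsourced tasting notes."""
    notes: Counter = field(default_factory=Counter)

    def wine_for_coaster(self, tag_id: int) -> str:
        """Resolve a recognized coaster to the wine placed on the table."""
        return COASTER_WINES.get(tag_id, "unknown wine")

    def select_flavor(self, wine: str, flavor: str) -> None:
        """Record one guest's tap on an aroma-wheel flavor."""
        self.notes[(wine, flavor)] += 1

    def top_notes(self, wine: str, n: int = 3) -> list[tuple[str, int]]:
        """What guests taste most often in this wine."""
        ranked = [(f, c) for (w, f), c in self.notes.items() if w == wine]
        return sorted(ranked, key=lambda pair: -pair[1])[:n]

session = TastingSession()
wine = session.wine_for_coaster(101)
for flavor in ("cherry", "leather", "cherry"):
    session.select_flavor(wine, flavor)
print(session.top_notes(wine))  # [('cherry', 2), ('leather', 1)]
```

    Aggregated this way, every tasting session feeds the winery a running picture of what guests actually perceive, which is the crowdsourcing the article mentions.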
    I’ve been interested in wine since working in a wine shop and a restaurant in my early twenties. I also have a small vineyard in Corrales, New Mexico, with 150 vines I’ve been growing for over a decade. This project blends my personal and professional interests like no other endeavor has. We’ve had our first pop-up event with VARA, with more planned, and will have our first permanent installation at the New Mexico Wine Association’s new tasting room in Old Town later this year.

    For more information: See the video, Ideum, Tangible Engine, Sensory Dining Website, Wine Aroma Wheel, VARA Winery & Distillery, New Mexico Wine Association

  • Groundbreaking Dance Performance Uses Motion Capture, Aerial Drones, and Visual Effects

    How technology and performance combine to create expressive storytelling and audience immersion. Jeff Gunderson, Stage Events, Nov 11, 2025

    The Netherlands-based contemporary dance company Another Kind of Blue (AKOB) has received widespread acclaim for its groundbreaking performances that explore the relationships between humans and technology. Founded by visionary choreographer David Middendorp, the company uses technology not only as a subject of exploration but also as a dynamic tool for expressive storytelling.

    “I’ve always been fascinated by the intersection of culture and technology,” said Middendorp. “People often perceive them as separate, but I firmly believe they are closely related. Technological innovations are often born from someone’s dreams. For instance, airplanes would never have been invented without the dream of flying. And I believe technology plays a significant role in shaping human nature. It contributes to our sense of identity.”

    Photo by Kim Vos Fotografie, courtesy of Another Kind of Blue and OptiTrack.

    Middendorp’s talent caught the attention of “America’s Got Talent,” where his choreographed performances reached the finals. Encouraged by the experience, Middendorp decided to establish AKOB. Several productions soon followed. AKOB’s artistic live dance performances feature imaginative uses of motion capture (mocap) technology, aerial drones, digital elements, and real-time visual effects and animations, captivating audiences with experiences that are both mesmerizing and thought-provoking.

    Photo by Kim Vos Fotografie, courtesy of Another Kind of Blue and OptiTrack.

    Discovering Motion Capture

    Middendorp’s exploration of aerial drones stemmed from his desire to examine the concept of free will. The early stages of the concept involved someone remotely controlling a single drone from the wings. Then the idea evolved into a swarm of drones that would form into certain shapes. However, 10 to 20 people operating drones in synchronized patterns proved impossible. “We started to search for solutions,” Middendorp said. “One idea was to build our own localization system. Then, as we were looking at other possibilities, we discovered OptiTrack.”

    Middendorp purchased his first eight OptiTrack cameras early in his explorations and has slowly been accumulating more. Today, AKOB has 23 OptiTrack mocap cameras—16 Primeˣ 13 cameras and 7 Primeˣ 22 cameras. The configuration provides real-time, low-latency tracking that optimizes accuracy across capture areas.

    Photo courtesy of OptiTrack.

    As Middendorp became more proficient with the mocap system, his team began using it to track dancers on stage wearing OptiTrack mocap suits. Using the positional data, the dancers’ movements were translated into commands for controlling the drones in real time. The subsequent performance, “Airman,” featured 12 drones, some flying predetermined paths and others programmed to respond to the dancers’ movements. According to Middendorp, “With OptiTrack, we can track drones, dancers, and other elements. Another advantage is that the infrared cameras work in low-light conditions, which is crucial since we often darken the stage to create a specific ambiance.”

    Imaginative Uses

    Another idea conceived by Middendorp was to create a duet that explored interactions between dancers and elements of physics.
    He envisioned a powerful way to visualize physics through sound waves, and particularly an effect known as Chladni patterns. Named after the 18th-century German physicist and musician Ernst Chladni, these intricate patterns emerge when a flat surface with a sprinkling of sand (or a similar substance) vibrates at specific sound wave frequencies, causing the sand to move and gather at certain locations on the surface (a minimal sketch of these patterns appears at the end of this article).

    Rather than making the entire stage shake, he chose to replicate the sand motion through simulation, leading to the development of a dynamic virtual sand representation complete with the ability to form Chladni patterns on a stage. However, for live performances, he needed to depict a convincing engagement between dancers and the simulated sand. OptiTrack was able to help him with a solution. In the performance “Wave,” virtual sand is projected onto the stage, and dancers, donning OptiTrack mocap suits, are tracked with precision using the OptiTrack system. Similar to the way drones can be controlled, OptiTrack allows the dancers’ movements to manipulate the virtual sand in real time. The result is an immersive presentation where the interplay between the dancers and the simulated sand appears remarkably true-to-life.

    Forging Into New Dimensions

    The most recent production from AKOB originated from what Middendorp calls “a fantasy.” He said, “What if you could visit people that aren’t here anymore? Maybe they passed away or maybe they just left. But what if there was a virtual space where you could still interact with them? I wanted to use this concept to create a choreography.”

    His piece, “Missing” (part of the new full-length performance “Digital Twin”), transports audiences into a near-future scenario where a dancer engages in a duet with the digital version of someone who is no longer physically here but continues to exist in an alternate reality. The performance offers a profound insight into the possibility of leaving a digital version of ourselves for our loved ones. To create this experience, Middendorp uses a “virtual mirror” on stage, a very large display that rotates during the performance, seamlessly reflecting both virtual and real objects. Using OptiTrack, the hands of dancers on stage are closely tracked, translating gestures into movements performed by digital counterparts in the mirror. Middendorp said, “OptiTrack serves a critical role in the performance for making the display function like a true mirror.”

    AKOB continues to delve into new realms of creative possibility, pushing the boundaries of contemporary dance, while OptiTrack provides multiple tracking solutions in one system. “I am always learning new things about the system and what it can do. If I have a spare moment, I sometimes just play with it, which is very useful for developing new ideas,” Middendorp said.

    For more information: OptiTrack, OptiTrack Cameras, OptiTrack Mocap Software, Another Kind of Blue. Read more about stage performances >>>
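    AKOB’s production sand simulation is proprietary, but the Chladni patterns it reproduces come straight from textbook physics. A minimal numpy sketch of the idealized square-plate version, offered only as a stand-in for the real system:

```python
import numpy as np

def chladni_field(n: int, m: int, size: int = 400) -> np.ndarray:
    """Vibration amplitude over an idealized square plate for mode (n, m).

    Sand gathers along the nodal lines, where the amplitude is near zero.
    """
    x = np.linspace(0.0, 1.0, size)
    X, Y = np.meshgrid(x, x)
    return (np.cos(n * np.pi * X) * np.cos(m * np.pi * Y)
            - np.cos(m * np.pi * X) * np.cos(n * np.pi * Y))

# Virtual "grains" settle wherever the driving frequency leaves the surface still.
field = chladni_field(n=3, m=5)
nodal = np.abs(field) < 0.02
print(f"{nodal.mean():.1%} of the surface lies on nodal lines")
```

    In a live version, each audio frequency would select a mode pair (n, m), and tracked dancer positions could perturb the simulated grains before they drift back toward these nodal lines.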

  • Introducing X1: The World's First Multirobot System

    EE Staff, Mini Story, Oct 27, 2025

    Caltech’s Center for Autonomous Systems and Technologies (CAST) and the Technology Innovation Institute (TII) in Abu Dhabi, United Arab Emirates, joined forces to design and build a multimodal system. The aim was to create a robot that can fly, drive, and walk, so that users don’t have to settle on one mode of operation—or purchase three devices. This demo shows how three distinct robot types come together to form a multimodal system offering a single solution to multiple challenges. We still think it operates like a Transformer, which is pretty cool.

    Read the whole article from Caltech here.

    Photo courtesy of Academic Media Technologies/Caltech.
