Acoustic holography: Using sound waves to levitate matter | with Sriram Subramanian

Below is a short summary and detailed review of this video, written by FutureFactual:

Levitation with Sound: Acoustic Traps, Holographic Fields, and Multimodal Feedback

Short summary

In this talk, the presenter demonstrates acoustic levitation and acoustic holography using ultrasonic transducers. He explains how phase control and standing waves create traps that hold tiny objects in midair, and how large arrays can shape complex acoustic fields. A live demo shows an object levitating and moving under the control of the presenter's finger, followed by a rapid demonstration of haptic feedback and sound delivered through modulated ultrasound. The session also covers turning an acoustic field into a visual hologram and solving the inverse problem of reproducing target shapes, with applications ranging from 3D printing to seed sorting and multimodal experiences combining shape, touch, and audio.

  • Two-speaker levitation and phase control
  • Ultrasound array enables complex traps and holographic shapes
  • Fast focal-point calculation with haptic and audio feedback
  • Inverse problem approach for acoustic holography and practical applications

Acoustic levitation and holography: fundamentals

The talk opens by discussing why floating objects capture the imagination, then grounds the demonstration in acoustic physics. Levitation is achieved with two transducers facing each other, generating a standing wave. A small object placed in the field is pushed towards the pressure nodes of the standing wave, where it is trapped. The presenter introduces the concept of stability, noting that a minimum pressure amplitude is needed for a robust trap, and that for very small objects (smaller than the wavelength) the Gor'kov potential provides a practical way to compute the forces as a sum of pressure and velocity terms. This frames acoustic levitation as a tractable physics problem rather than a purely magical effect.
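
For readers who want a feel for how the Gor'kov potential is used, here is a minimal Python sketch that evaluates the potential for a small bead in a one-dimensional standing wave and locates the traps at its minima, spaced half a wavelength apart. The bead properties and drive pressure are illustrative assumptions, not values from the talk.

```python
import numpy as np

# Illustrative parameters (assumed, not taken from the talk)
f = 40e3                      # drive frequency [Hz]
c0, rho0 = 343.0, 1.2         # speed of sound and density of air
cp, rhop = 2350.0, 25.0       # sound speed and density of an expanded-polystyrene bead
p0 = 1500.0                   # pressure amplitude of each travelling wave [Pa]
R = 1e-3                      # bead radius [m], well below the ~8.6 mm wavelength

k = 2 * np.pi * f / c0
x = np.linspace(0.0, c0 / f, 2000)          # one wavelength along the axis

# Time-averaged squared pressure and particle velocity of a 1-D standing wave
p_sq = 2 * p0**2 * np.cos(k * x)**2
v_sq = 2 * (p0 / (rho0 * c0))**2 * np.sin(k * x)**2

# Gor'kov monopole (compressibility) and dipole (density) coefficients
f1 = 1 - (rho0 * c0**2) / (rhop * cp**2)
f2 = 2 * (rhop - rho0) / (2 * rhop + rho0)

# Gor'kov potential: a pressure term minus a velocity term
U = 2 * np.pi * R**3 * (f1 * p_sq / (3 * rho0 * c0**2) - f2 * rho0 * v_sq / 2)

# Radiation force is the negative gradient; stable traps sit where it crosses + to -
F = -np.gradient(U, x)
traps = x[(np.r_[0.0, F[:-1]] > 0) & (np.r_[F[1:], 0.0] < 0)]
print("trap positions [m]:", traps)          # spaced roughly half a wavelength apart
```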

From simple to complex fields: ultrasound arrays

Moving beyond two speakers, the presenter explores an array of ultrasound transducers operating at about 40 kHz, each element under independent control. By adjusting the phase delays across the array, the field can be shaped into different trapping configurations, such as twin traps and vortex-like patterns. The goal is to concentrate acoustic energy at a focal point where the object sits stably, while also shaping the surrounding pressure field to hold it in place. The analogy of pinching and grasping with fingers helps the audience visualize how multiple elements coordinate to form a single stable focus.
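
As a sketch of the phase-delay idea, the snippet below chooses each element's phase to cancel its path length to a desired focal point, so all contributions arrive there in phase. The array geometry, frequency, and the simple point-source field model are assumptions for illustration only.

```python
import numpy as np

# Illustrative 16 x 16 array with 10 mm pitch in the z = 0 plane
f = 40e3
c = 343.0
k = 2 * np.pi * f / c

pitch = 0.01
coords = (np.arange(16) - 7.5) * pitch
X, Y = np.meshgrid(coords, coords)
elements = np.stack([X.ravel(), Y.ravel(), np.zeros(X.size)], axis=1)

focus = np.array([0.0, 0.0, 0.12])            # desired focal point, 12 cm above the array

# Each element fires with a phase that cancels its path length to the focus,
# so every contribution arrives there in phase.
phases = (-k * np.linalg.norm(elements - focus, axis=1)) % (2 * np.pi)

def field_at(point):
    """Complex pressure at a field point, simple point-source superposition."""
    r = np.linalg.norm(elements - point, axis=1)
    return np.sum(np.exp(1j * (k * r + phases)) / r)

print("|p| on focus :", abs(field_at(focus)))
print("|p| off focus:", abs(field_at(focus + np.array([0.02, 0.0, 0.0]))))
```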

Visualization, speed, and multi-modal feedback

The demonstration shows the focus pattern and how rapidly it can be updated. The focal point can be recomputed at very high frame rates, enabling the system to respond to fast finger movements and even catch objects that are released. The talk then transitions to haptic feedback: by driving the ultrasound array at high speed and modulating its amplitude, tactile sensations can be delivered to the palm, creating a wind-like tickle that accompanies the levitating object. A volunteer tests the haptic feedback, reporting a ticklish, wind-like sensation while the object remains levitated.
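
The modulation idea behind the tactile effect can be sketched in a few lines: the skin cannot follow a 40 kHz carrier, but it does feel a slowly varying envelope imposed on the focused beam. The rates below are illustrative assumptions, not values quoted in the talk.

```python
import numpy as np

# Illustrative rates: a 40 kHz carrier with a 200 Hz amplitude envelope
fs = 1_000_000                 # simulation sample rate [Hz]
carrier_hz = 40e3
envelope_hz = 200.0

t = np.arange(0, 0.02, 1 / fs)                                  # 20 ms of signal
envelope = 0.5 * (1 + np.sin(2 * np.pi * envelope_hz * t))      # 0..1 amplitude
drive = envelope * np.sin(2 * np.pi * carrier_hz * t)           # modulated transducer drive

# The radiation force on the skin tracks the slowly varying envelope (roughly
# the envelope squared), not the inaudible, un-feelable 40 kHz carrier itself.
force_proxy = envelope**2
print(f"carrier cycles per tactile cycle: {carrier_hz / envelope_hz:.0f}")
```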

Inverse problem and real-time holography

The speaker then tackles the inverse problem: given a target acoustic pattern, what amplitudes and phases should the array produce? By combining a direct field calculation with a back-and-forth refinement towards the target and then re-imposing the amplitude constraints, the method converges to a practical solution. A key optimization trick is to treat the transducer amplitude as fixed and solve only for the phase, allowing fast updates and real-time holography with multiple objects or dynamic shapes. The approach is analogous to the Gerchberg–Saxton algorithm used in optics, adapted to the acoustic domain, where the ability to control both amplitude and phase at each speaker helps the iteration converge rapidly.
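
A minimal sketch of this style of back-and-forth iteration is shown below, under the assumption of unit-amplitude transducers and a random toy propagation matrix standing in for the real array geometry.

```python
import numpy as np

rng = np.random.default_rng(0)
n_transducers, n_points = 256, 4

# Toy propagation matrix: transducer j -> control point i. A real system would
# compute these entries from geometry (e.g. exp(i k r) / r terms).
H = rng.normal(size=(n_points, n_transducers)) + 1j * rng.normal(size=(n_points, n_transducers))
target_amp = np.ones(n_points)               # equal pressure amplitude at every control point

phases = np.zeros(n_transducers)             # start with all transducers in phase
for _ in range(100):
    x = np.exp(1j * phases)                            # unit-amplitude drive (hardware constraint)
    p = H @ x                                          # forward: field at the control points
    p = target_amp * np.exp(1j * np.angle(p))          # keep phases, impose target amplitudes
    phases = np.angle(H.conj().T @ p)                  # back-propagate, keep only the phases

achieved = np.abs(H @ np.exp(1j * phases))
print("relative amplitudes at the control points:", achieved / achieved.max())
```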

Accounting for reflections and complex objects

Real-world objects reflect and scatter sound, complicating levitation. The team uses a boundary element approach to discretize the object into small patches and compute direct and reflected paths, updating the pattern accordingly. If the object remains static, these reflections can be precomputed, enabling efficient real-time control even with complex shapes. The result is a platform capable of levitating various materials and even simulating multi-material 3D printing fields, hinting at a future where acoustic manipulation can place or pattern materials without physical contact.
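
The precomputation idea can be sketched as follows; the matrices here are random stand-ins for the direct and scattered propagation operators that a real boundary element model would assemble from the object's surface patches.

```python
import numpy as np

rng = np.random.default_rng(1)
n_transducers, n_points, n_patches = 256, 8, 500

def toy_propagator(shape):
    # Stand-in for a geometric propagator (a BEM code would fill this with
    # exp(i k r) / r style entries between real positions)
    return rng.normal(size=shape) + 1j * rng.normal(size=shape)

G_direct = toy_propagator((n_points, n_transducers))     # transducers -> control points
G_t2s = toy_propagator((n_patches, n_transducers))       # transducers -> object surface patches
G_s2p = toy_propagator((n_points, n_patches))            # scattered field: patches -> points
reflectance = 0.3 * np.eye(n_patches)                    # toy per-patch reflection model

# Offline, while the object is static: fold direct + first-order reflected paths
# into a single operator.
H_total = G_direct + G_s2p @ reflectance @ G_t2s

# Online: the real-time loop is still one matrix-vector product per update,
# with the object's reflections already accounted for.
drive = np.exp(1j * rng.uniform(0, 2 * np.pi, n_transducers))
pressure_at_points = H_total @ drive
print("field magnitudes with reflections included:", np.abs(pressure_at_points))
```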

Applications and future directions

The talk closes by highlighting potential applications in medical dashboards and automotive haptics, as well as in 3D printing and seed sorting. The combination of levitation, holography, and tactile-audio feedback creates opportunities for immersive, contact-free manipulation of matter, including liquids and powders, and paves the way for multi-material printing and bio-printing. The presenter stresses the broader vision of acoustic holography as a general tool for 3D material placement in free space, controlled by precise acoustic shaping rather than mechanical contact. The session ends with an invitation for questions and further exploration of this rapidly developing field.

Key takeaways

  • Acoustic levitation relies on stable traps created by phase-controlled standing waves
  • Large ultrasound arrays enable complex, programmable trapping patterns
  • Haptic and auditory feedback can be delivered ultrasonically to create multimodal experiences
  • Inverse problem solving and real-time computation enable rapid acoustic holography
