A common challenge for audiologists when fitting hearing aids is that quiet clinics are unlike the noisy, reverberant places where patients report difficulty. This mismatch makes it difficult to obtain direct patient feedback on hearing aid performance during fitting. Virtual acoustics can allow a patient to experience a hearing aid in such environments without leaving the clinic. A virtual reality (VR) audio-visual demonstration has been created that “test drives” hearing aid features in real-world scenarios, dynamically rendered over custom wired hearing aids and headphones. The scenes were created from 360° photographs and acoustic measurements made in real rooms. A room acoustic model, tuned to match the measured data, was used for the simulation. Audio was rendered using principal component-based amplitude panning (PCBAP), which can provide dynamic VR audio with broadband magnitude and phase accuracy. This talk will present comparisons between PCBAP and higher-order Ambisonics (HOA) for rendering hearing aid beamformers in anechoic and reverberant environments. Results show that PCBAP can render a beamformer’s directivity broadband with 2 to 3 dB error (95%) using only 36 filters, in both anechoic and reverberant conditions. Even seventh-order HOA (64 filters) does not achieve similar performance. The talk will also include videos of the demonstration, highlighting its viability for clinical applications.
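As context for the filter counts quoted above (a standard Ambisonics relation, inferred here rather than stated in the abstract): an order-N HOA rendering requires one filter per spherical-harmonic channel, giving

C_HOA = (N + 1)^2,

so seventh order yields (7 + 1)^2 = 64 filters, consistent with the figure above, while PCBAP achieves its reported broadband directivity accuracy with only 36.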