Telematic performances connect musicians and artists at remote locations to form a single cohesive piece. As these performances become more common with the spread of very high-speed Internet access, a variety of new technologies will enable artists and musicians to create entirely new styles of work. The development of immersive virtual environments, including Rensselaer Polytechnic Institute's own Collaborative-Research Augmented Immersive Virtual Environment Laboratory, sets the stage for these original pieces. Accurate sound spatialization within these environments is an essential part of that toolset. This project uses a local installation to demonstrate the techniques and protocols that make such spatialization possible. Using the visual coding environment MaxMSP as a receiving client, patches are created to parse incoming commands and coordinate data that control active sound sources. Spatialization is performed with the Virtual Microphone Control system, whose output is mapped to loudspeakers through a patch portable across different immersive environment setups.
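To illustrate the kind of parsing the receiving client performs, the sketch below shows how an OSC-style message carrying a source identifier and 3-D coordinates might be unpacked before the values are handed to a spatialization engine. The `/source/<id>/position` address pattern and argument layout are assumptions for illustration, not the project's actual protocol, and the example is written in Python rather than as a MaxMSP patch.

```python
# Hypothetical sketch: unpack a source ID and (x, y, z) coordinates from an
# OSC-style message, analogous to what a MaxMSP receiving patch would do
# before routing the values to a spatialization system.

def parse_position_message(address: str, args: list):
    """Return (source_id, (x, y, z)) from a '/source/<id>/position' message."""
    parts = address.strip("/").split("/")
    if len(parts) != 3 or parts[0] != "source" or parts[2] != "position":
        raise ValueError(f"unexpected address: {address}")
    source_id = int(parts[1])            # which sound source to move
    x, y, z = (float(v) for v in args)   # coordinates in the room's frame
    return source_id, (x, y, z)

# Example: move source 3 to a point 1.5 m right, 0.25 m back, 2 m up.
sid, pos = parse_position_message("/source/3/position", [1.5, -0.25, 2.0])
```

In a real deployment, messages like this would arrive over the network (for example via UDP) and the parsed coordinates would drive the virtual microphone positions that ultimately determine loudspeaker gains.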