QUASAR Client

Quad-based Adaptive Streaming And Rendering

Overview

QUASAR is a remote rendering system that represents scene views using pixel-aligned quads, enabling temporally consistent, bandwidth-adaptive streaming for high-quality, real-time visualization on thin clients.

This repository provides baseline implementations of remote rendering systems designed to support and accelerate research in the field. For a detailed discussion of the design principles, implementation details, and benchmarks, please see our paper:

QUASAR: Quad-based Adaptive Streaming And Rendering
Edward Lu and Anthony Rowe
ACM Transactions on Graphics 44(4) (proc. SIGGRAPH 2025)

Paper: https://quasar-gfx.github.io/assets/quasar_siggraph_2025.pdf
GitHub (Main): https://github.com/quasar-gfx/QUASAR
GitHub (Client): https://github.com/quasar-gfx/QUASAR-client

QUASAR system diagram.

Install Dependencies

To download the OpenXR client code, clone the repository at https://github.com/quasar-gfx/QUASAR-client using git:

git clone --recursive git@github.com:quasar-gfx/QUASAR-client.git

If you accidentally cloned the repository without --recursive, you can run:

git submodule update --init --recursive

Note: The repository code has been tested on Meta Quest 2, Meta Quest Pro, and Meta Quest 3.

Building and Running

The following steps assume you have the latest Android Studio installed on your computer:

  1. Connect your headset to your host machine. The headset can either be connected using a cable or wirelessly (see Debugging).
  2. If this is the first time you are launching Android Studio, select Open an existing Android Studio project. If you have launched Android Studio before, click File > Open instead.
  3. Open QuestClientApps/build.gradle from the QUASAR-client repository in Android Studio. Select an app in the Configurations menu at the top (the dropdown to the left of the Run button), then click the Run button to build and install it (the first open and build may take a while).

Sample Apps

All apps allow you to move through a scene using the controller joysticks. You will move in the direction you are looking.

SceneViewer

The Scene Viewer app loads a GLTF/GLB scene on the headset to view (you can download example scenes from this link).

Download and unzip into QuestClientApps/Apps/SceneViewer/assets/models/scenes/ (this will be gitignored).

Note: Keep only ONE .glb file in the scenes/ directory at a time; the headset can run out of storage if all scenes are present. For example, keep just RobotLab.glb in scenes/.

ATWClient

The ATW Client app allows the headset to act as a receiver for a stereo video stream from the server sent across a network. The headset will reproject each eye using a homography to warp the images along a plane.

First, open QuestClientApps/Apps/ATWClient/include/ATWClient.h and change:

std::string serverIP = "192.168.4.140";

to your server's IP address. Then, in the QUASAR repository, build and run:

# in build directory
cd apps/atw/streamer
./atw_streamer --size 2048x1024 --scene ../assets/scenes/robot_lab.json --video-url <headset's IP>:12345 --vr 1

To get your headset's IP address, run:

adb shell ip addr show wlan0  # look for address after 'inet'

in a terminal with your headset connected. Make sure you have adb installed.

MeshWarpViewer

The MeshWarp Viewer app will load a saved static frame from MeshWarp to view on the headset. Simply run the app on the headset to view!

MeshWarpClient

The MeshWarp Client app allows the headset to act as a receiver for a video and depth stream from the server sent across a network. The headset will reconstruct a mesh using the depth stream and texture map it with the video for reprojection.

First, open QuestClientApps/Apps/MeshWarpClient/include/MeshWarpClient.h and change:

std::string serverIP = "192.168.4.140";

to your server's IP address. Then, in the QUASAR repository, build and run:

# in build directory
cd apps/meshwarp/streamer
./mw_streamer --size 1920x1080 --scene ../assets/scenes/robot_lab.json --video-url <headset's IP>:12345 --depth-url <headset's IP>:65432 --depth-factor 4

QuadsViewer

The Quads Viewer app will load a saved static frame from QuadWarp to view on the headset. You can change the scene by editing:

std::string sceneName = "robot_lab"; // choose from robot_lab, sun_temple, viking_village, or san_miguel

in QuestClientApps/Apps/QuadsViewer/include/QuadsViewer.h.

QuadStreamViewer

The QuadStream Viewer app will load a saved static frame from QuadStream to view on the headset. You can change the scene by editing:

std::string sceneName = "robot_lab"; // choose from robot_lab, sun_temple, viking_village, or san_miguel

in QuestClientApps/Apps/QuadStreamViewer/include/QuadStreamViewer.h.

QUASARViewer

The QUASAR Viewer app will load a saved static frame from QUASAR to view on the headset. You can change the scene by editing:

std::string sceneName = "robot_lab"; // choose from robot_lab, sun_temple, viking_village, or san_miguel

in QuestClientApps/Apps/QUASARViewer/include/QUASARViewer.h.

Debugging

To wirelessly connect to your headset, type this into your terminal with your headset plugged in:

adb tcpip 5555
adb connect <headset's ip address>:5555

Now, you don't need to plug in your headset to upload code!

To debug/view print statements, see the Logcat tab on Android Studio if the headset is connected.

Credit

A majority of the OpenXR code is based on the OpenXR Android OpenGL ES tutorial.

Citation

If you find this project helpful for any research-related purposes, please consider citing our paper:

@article{lu2025quasar,
    title={QUASAR: Quad-based Adaptive Streaming And Rendering},
    author={Lu, Edward and Rowe, Anthony},
    journal={ACM Transactions on Graphics (TOG)},
    volume={44},
    number={4},
    year={2025},
    publisher={ACM New York, NY, USA},
    url={https://doi.org/10.1145/3731213},
    doi={10.1145/3731213},
}

Acknowledgments

We gratefully acknowledge the authors of QuadStream and PVHV for their foundational ideas, which served as valuable inspiration for our work.

This work was supported in part by the NSF under Grant No. CNS1956095, the NSF Graduate Research Fellowship under Grant No. DGE2140739, and Bosch Research.

Special thanks to Ziyue Li and Ruiyang Dai for helping on the implementation!

This webpage is adapted from nvdiffrast. We sincerely appreciate the authors for open-sourcing their code.