Thank you anytime

12/7/2023

Powerful autoregressive models involve a slow sampling process that is sequential in nature and typically scales linearly with respect to the data dimension; this challenge impedes their deployment. The sampling process of these models, moreover, does not allow interruptions and cannot adapt to real-time computational resources. To address this difficulty, we propose a new family of autoregressive models that enables anytime sampling. Inspired by Principal Component Analysis, we learn a structured representation space where dimensions are ordered based on their importance with respect to reconstruction. Using an autoregressive model in this latent space, we trade off sample quality for computational efficiency by truncating the generation process before decoding into the original data space.
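The PCA intuition behind the ordered representation can be illustrated in a few lines of NumPy. This is only a hypothetical sketch of the truncation idea, not the method itself: the approach described above learns the ordering with an autoencoder and samples the latent code autoregressively, whereas here exact PCA supplies the ordered dimensions. Keeping only the first `k` dimensions of the code trades reconstruction quality for computation, mirroring how truncating the generation process early yields a coarser but cheaper sample.

```python
import numpy as np

# Toy data with correlated dimensions, centered for PCA.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 16)) @ rng.normal(size=(16, 16))
X = X - X.mean(axis=0)

# SVD yields principal directions ordered by explained variance:
# the "importance ordering" that the learned latent space emulates.
U, S, Vt = np.linalg.svd(X, full_matrices=False)

def reconstruct(k):
    """Encode with only the top-k ordered dimensions, then decode."""
    Z = X @ Vt[:k].T   # truncated latent code (first k dimensions)
    return Z @ Vt[:k]  # decode back into the original data space

# Reconstruction error shrinks monotonically as more ordered
# dimensions are kept -- the quality/compute trade-off.
errors = [np.linalg.norm(X - reconstruct(k)) for k in (2, 8, 16)]
assert errors[0] >= errors[1] >= errors[2]
```

Because the dimensions are ordered, stopping the (sequential) generation of the latent code at any point still yields a usable sample; an unordered code would not degrade this gracefully under truncation.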