by Zdybal, Kamila; Parente, Alessandro; Sutherland, James C.
Reference
Publication: Published, 2023-02-27
Conference abstract
Abstract: The first step in reduced-order modeling (ROM) workflows is finding a low-dimensional representation of a high-dimensional system. The second step of ROM often requires training a nonlinear regression model to predict physical quantities of interest from the reduced representation. Much of the research on training ROMs thus far has tackled those two steps separately. While each comes with its own challenges, a good-quality low-dimensional system representation usually facilitates building a regression model. In this work, we leverage this inherent link between dimensionality reduction and nonlinear regression. We propose an approach in which dimensionality reduction and nonlinear regression are considered jointly within an autoencoder-like neural-network architecture. The dimensionality reduction (encoding) is affected by forcing accurate regression (decoding) of the quantities of interest. We show that such a joint architecture leads to improved low-dimensional representations, as the two steps communicate with each other through backpropagation. We further incorporate Bayesian optimization of the autoencoder hyperparameters using a recently proposed objective function that quantifies the quality of low-dimensional data representations. We apply our regression-aware autoencoder to test cases coming from reacting flow systems whose original dimensionalities range from 10 to 50 and are efficiently reduced to 2 or 3. The relevant quantities of interest are the important state variables, such as temperature and major chemical species, and the highly nonlinear source terms required by the reduced model. The proposed approach can serve as an effective replacement for standalone dimensionality reduction techniques whenever nonlinear regression is anticipated in the downstream use.
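The joint encoding-regression idea described in the abstract can be illustrated with a minimal sketch. The example below is NOT the authors' architecture: it uses synthetic data, purely linear encoder and decoder maps, and hand-derived gradients, whereas the paper uses a nonlinear autoencoder-like network with Bayesian hyperparameter optimization. All variable names and dimensions here are hypothetical. The point it demonstrates is the coupling: the regression residual on the quantities of interest backpropagates through the decoder into the encoder, so the low-dimensional representation is shaped by regression accuracy.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a reacting-flow state: 10 original variables,
# reduced to a 2-dimensional representation, regressing 3 quantities of interest.
n_samples, n_state, n_latent, n_qoi = 200, 10, 2, 3
X = rng.normal(size=(n_samples, n_state))
# Synthetic nonlinear targets playing the role of the quantities of interest.
W_true = rng.normal(size=(n_state, n_qoi))
Y = np.tanh(X @ W_true)

# Encoder E: R^10 -> R^2 and decoder D: R^2 -> R^3 (linear for brevity).
E = rng.normal(scale=0.1, size=(n_state, n_latent))
D = rng.normal(scale=0.1, size=(n_latent, n_qoi))

lr = 0.05
loss_history = []
for step in range(2000):
    Z = X @ E            # encoding: low-dimensional representation
    Y_hat = Z @ D        # decoding: regression of the quantities of interest
    R = Y_hat - Y        # regression residual
    loss_history.append(np.mean(R**2))
    # Joint gradients: the regression error flows through D back into E,
    # so encoding and regression "communicate" during training.
    dL = 2.0 * R / (n_samples * n_qoi)
    grad_D = Z.T @ dL
    grad_E = X.T @ (dL @ D.T)
    D -= lr * grad_D
    E -= lr * grad_E

print(f"loss: {loss_history[0]:.4f} -> {loss_history[-1]:.4f}")
```

Because the linear maps cannot represent the tanh nonlinearity exactly, the loss does not vanish, but it decreases steadily; replacing the linear maps with nonlinear layers (and automatic differentiation) would give the autoencoder-like setup the abstract describes.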