A list of awesome resources on normalizing flows: https://github.com/janosh/awesome-normalizing-flows (worth bookmarking for study).

A PyTorch Implementation of Density Estimation Using Real NVP https://github.com/xqding/RealNVP

A collection of resources regarding the interplay between differential equations, dynamical systems, deep learning, control and optimization. https://github.com/Zymrael/awesome-neural-ode

DeeProb-kit is a unified Python library collecting deep probabilistic models (DPMs) that are tractable and exact representations of the modelled probability distributions. https://github.com/deeprob-org/deeprob-kit

Normalising flows:

D. Jimenez Rezende and S. Mohamed, Variational Inference with Normalizing Flows, arXiv:1505.05770 (2015) [stat.ML].
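The planar flow introduced in that paper is simple enough to write out directly. A minimal NumPy sketch (the parameter values are made up for illustration; in practice u, w, b are learned):

```python
import numpy as np

def planar_flow(z, u, w, b):
    """Planar flow f(z) = z + u * tanh(w.z + b) and its log |det Jacobian|.

    z: (n, d) batch of points; u, w: (d,) flow parameters; b: scalar.
    """
    a = z @ w + b                            # (n,) pre-activations
    f = z + np.outer(np.tanh(a), u)          # (n, d) transformed points
    psi = np.outer(1 - np.tanh(a) ** 2, w)   # (n, d), psi(z) = h'(a) * w
    log_det = np.log(np.abs(1.0 + psi @ u))  # (n,) log |det Jacobian|
    return f, log_det

# Push standard-normal samples through one planar flow
rng = np.random.default_rng(0)
z = rng.standard_normal((5, 2))
f, log_det = planar_flow(z, u=np.array([0.5, -0.3]), w=np.array([1.0, 0.2]), b=0.1)
print(f.shape, log_det.shape)  # (5, 2) (5,)
```

Note the invertibility condition w.u >= -1 from the paper is satisfied by these example parameters (w.u = 0.44).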

For a more detailed description of normalising flows, the different types, and their application, we point the reader to [31, 32].

[31] I. Kobyzev, S. J. D. Prince, and M. A. Brubaker, Normalizing Flows: An Introduction and Review of Current Methods, arXiv:1908.09257 (2019) [stat.ML].
[32] G. Papamakarios, E. Nalisnick, D. Jimenez Rezende, S. Mohamed, and B. Lakshminarayanan, Normalizing Flows for Probabilistic Modeling and Inference, arXiv:1912.02762 (2019) [stat.ML].

Normalising flows generally fall into two categories depending on how the mappings are defined: autoregressive flows [38, 39] and flows based on coupling transforms [40–42].

[38] G. Papamakarios, T. Pavlakou, and I. Murray, Masked Autoregressive Flow for Density Estimation, arXiv:1705.07057 (2017) [stat.ML].
[39] C.-W. Huang, D. Krueger, A. Lacoste, and A. Courville, Neural Autoregressive Flows, arXiv:1804.00779 (2018) [cs.LG].
[40] L. Dinh, J. Sohl-Dickstein, and S. Bengio, Density estimation using Real NVP, arXiv:1605.08803 (2016) [cs.LG].
[41] D. P. Kingma and P. Dhariwal, Glow: Generative Flow with Invertible 1x1 Convolutions, arXiv:1807.03039 (2018) [stat.ML].
[42] C. Durkan, A. Bekasov, I. Murray, and G. Papamakarios, Neural Spline Flows, arXiv:1906.04032 (2019) [stat.ML].
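An affine coupling transform in the RealNVP style [40] takes only a few lines: half of the input passes through unchanged and conditions an affine map of the other half, so both the inverse and the log-determinant are cheap. A minimal NumPy sketch (fixed random linear maps stand in for the scale/shift networks):

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy "networks" s and t: fixed random linear maps (stand-ins for learned MLPs)
Ws, Wt = rng.standard_normal((2, 2)) * 0.1, rng.standard_normal((2, 2)) * 0.1

def coupling_forward(z):
    """Affine coupling: transform half of z conditioned on the untouched half."""
    z1, z2 = z[:, :2], z[:, 2:]
    s, t = z1 @ Ws, z1 @ Wt          # scale and shift depend only on z1
    y2 = z2 * np.exp(s) + t
    log_det = s.sum(axis=1)          # log |det J| is just the sum of the scales
    return np.concatenate([z1, y2], axis=1), log_det

def coupling_inverse(y):
    """Exact inverse: recompute s, t from the untouched half and undo the map."""
    y1, y2 = y[:, :2], y[:, 2:]
    s, t = y1 @ Ws, y1 @ Wt
    z2 = (y2 - t) * np.exp(-s)
    return np.concatenate([y1, z2], axis=1)

z = rng.standard_normal((4, 4))
y, log_det = coupling_forward(z)
print(np.allclose(coupling_inverse(y), z))  # True
```

In a real flow, several such layers are stacked with the halves swapped (or permuted) between layers so every dimension gets transformed.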

More recently this has been generalised to linear transforms in which the permutation is learnt during training [41].

[41] D. P. Kingma and P. Dhariwal, Glow: Generative Flow with Invertible 1x1 Convolutions, arXiv:1807.03039 (2018) [stat.ML].
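Glow's learned 1x1 convolution is such an invertible linear map; parameterising it as W = P L U (fixed permutation P, unit-lower-triangular L, upper-triangular U) keeps the log-determinant trivial to evaluate. A NumPy sketch, with random values standing in for learned parameters:

```python
import numpy as np

rng = np.random.default_rng(2)
d = 3
# PLU parameterisation of an invertible linear layer, as in Glow's 1x1 conv:
P = np.eye(d)[rng.permutation(d)]                          # fixed permutation
L = np.tril(rng.standard_normal((d, d)), -1) + np.eye(d)   # unit lower-triangular (learnable)
log_s = rng.standard_normal(d) * 0.1                       # log |diag(U)| (learnable)
U = np.triu(rng.standard_normal((d, d)), 1) + np.diag(np.exp(log_s))

W = P @ L @ U
log_det = log_s.sum()  # log |det W| = sum log |diag(U)|: O(d), no determinant needed

print(np.isclose(np.log(np.abs(np.linalg.det(W))), log_det))  # True
```

Since |det P| = 1 and det L = 1, only the diagonal of U contributes, which is why the factorisation is worth the bookkeeping.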

How do you evaluate Normalizing Flows / Invertible Networks? (Zhihu discussion) https://www.zhihu.com/question/376122890

Boltzmann Generators and Normalizing Flows in PyTorch: https://github.com/noegroup/bgflow
A comparison of commonly used deep generative models: https://arxiv.org/pdf/2106.00792.pdf

How flows learn the posterior: https://arxiv.org/pdf/1605.06376.pdf
Priors in Bayesian Deep Learning: A Review: https://arxiv.org/pdf/2105.06868.pdf

Flows that Zhixiao is bullish on:
FloWaveNet: https://arxiv.org/abs/1811.02155
Parallel WaveNet: http://proceedings.mlr.press/v80/oord18a.html
To ensure both efficient density estimation and sampling, van den Oord et al. [2017] proposed an approach called Probability Density Distillation, which trains the flow f as normal and then uses it as a teacher network to train a tractable student network g.
Fourier Flows: https://openreview.net/pdf/7aad31a541edaadc936aa88af4d48ddc836b7344.pdf
A group in Australia is working on something similar to us: https://github.com/tanghyd/gravflows
Lei Wang's lectures: https://wangleiphy.github.io/lectures/PILtutorial.pdf and https://wangleiphy.github.io/lectures/Flow.pdf
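The Probability Density Distillation idea can be illustrated on a toy problem: sample from the student (which is cheap to sample), score those samples under both densities, and minimise the gap, i.e. a Monte-Carlo estimate of KL(student || teacher). A NumPy sketch with 1-D Gaussians standing in for the teacher flow and student network (all names and values here are illustrative, not from the paper):

```python
import numpy as np

def gauss_logpdf(x, mu, sigma):
    """Log-density of N(mu, sigma^2) evaluated at x."""
    return -0.5 * np.log(2 * np.pi * sigma**2) - 0.5 * ((x - mu) / sigma) ** 2

# Teacher density (stands in for the trained autoregressive teacher flow)
mu_t, sig_t = 2.0, 1.0

def distill_loss(mu_s, log_sig_s, rng, n=10_000):
    """Monte-Carlo KL(student || teacher): sample from the student,
    score the samples under both densities, average the log-ratio."""
    sig_s = np.exp(log_sig_s)
    z = mu_s + sig_s * rng.standard_normal(n)  # reparameterised student samples
    return np.mean(gauss_logpdf(z, mu_s, sig_s) - gauss_logpdf(z, mu_t, sig_t))

rng = np.random.default_rng(3)
# The loss shrinks as the student approaches the teacher:
print(distill_loss(0.0, 0.0, rng) > distill_loss(2.0, 0.0, rng))  # True
```

In the real method the student is an inverse-autoregressive flow trained by gradient descent on this objective; the Gaussians here only show the shape of the loss.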

Codes/Tutorials:
https://github.com/acids-ircam/pytorch_flows
https://github.com/papercup-open-source/tutorials
https://www.youtube.com/watch?v=u3vVyFVU_lI
https://uvadlc-notebooks.readthedocs.io/en/latest/tutorial_notebooks/tutorial11/NF_image_modeling.html
https://github.com/VincentStimper/normalizing-flows
https://github.com/desResLab/LINFA

'Normalizing Flows by PyTorch: PyTorch implementations of normalizing flows and their variants' by Tatsuya Yatagawa. GitHub: https://github.com/tatsy/normalizing-flows-pytorch

sbi: simulation-based inference. https://github.com/mackelab/sbi
sbi is a PyTorch package for simulation-based inference, the process of finding the parameters of a simulator from observations.
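To make "finding the parameters of a simulator from observations" concrete, here is the simplest possible instance, rejection ABC, in plain NumPy. This illustrates the problem setting only, not sbi's own (neural) API; the simulator and tolerance are toy choices:

```python
import numpy as np

rng = np.random.default_rng(4)

def simulator(theta, n=20):
    """Toy simulator: n Gaussian draws with unknown mean theta."""
    return rng.standard_normal(n) + theta

x_obs = simulator(1.5)  # "observed" data, true parameter theta = 1.5

# Rejection ABC: draw parameters from the prior, simulate, and keep the
# parameters whose summary statistic lands close to the observed one.
thetas = rng.uniform(-5, 5, size=50_000)
summaries = np.array([simulator(t).mean() for t in thetas])
accepted = thetas[np.abs(summaries - x_obs.mean()) < 0.1]

print(round(accepted.mean(), 2))  # approximate posterior mean, near 1.5
```

Neural SBI methods (like those in sbi) replace the hard accept/reject step with a learned density, ratio, or posterior estimator, which scales far better with dimension.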

simulation-based inference algorithm TMNRE (Truncated Marginal Neural Ratio Estimation) https://arxiv.org/abs/2107.01214

OT-Flow: Fast and Accurate Continuous Normalizing Flows via Optimal Transport https://github.com/EmoryMLIP/OT-Flow https://arxiv.org/pdf/2006.00104.pdf

zuko: normalizing flows in PyTorch. https://github.com/francois-rozet/zuko

He Wang, PostDoc