# Conditional Probability Estimation

## An online discussion group on the topic of Conditional Probability Estimation

### Introduction: Conditional Probability Estimation

### 12. Energy-based Models 3

Published: 2021-04-24

References:
- Pytorch Deep Learning Lectures
- Pytorch Deep Learning Slides
- A high-bias, low-variance introduction to Machine Learning for physicists

Summary: We will discuss energy-based learning in this session.

### 11. Energy-based Models 2

Published: 2021-02-27

References:
- Pytorch Deep Learning Lectures
- Pytorch Deep Learning Slides
- A high-bias, low-variance introduction to Machine Learning for physicists

Summary: We will discuss energy-based learning in this session.

### 10. Energy-based Models

Published: 2021-02-27

References:
- Pytorch Deep Learning Lectures
- Pytorch Deep Learning Slides
- A high-bias, low-variance introduction to Machine Learning for physicists

Summary: We will discuss energy-based learning in this session.

### 9. Summary of Generative Models

Published: 2021-02-27

References:
- Deep Generative Modelling: A Comparative Review of VAEs, GANs, Normalizing Flows, Energy-Based and Autoregressive Models

### 8. MAF: how is MADE being used

Published: 2021-02-27

Summary: We discussed MAF (arXiv:1705.07057v4) last time; the paper does not explain exactly how MADE is used to produce the shift and log-scale.
We will use the TensorFlow Probability implementation of MAF to probe this question. Here is the link to the relevant documentation: https://www.tensorflow.org/probability/api_docs/python/tfp/bijectors/MaskedAutoregressiveFlow
Topics: refer to the references.
Notes: 1310.8499_notes.pdf
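As a rough sketch of the mechanism in question: in MAF, a MADE-style conditioner maps the preceding dimensions x_{<i} to a shift mu_i and log-scale alpha_i for dimension i, and density evaluation applies the inverse affine transform plus an analytic log-determinant. The `conditioner` below is a hypothetical stand-in (any autoregressive function works), not MADE or the TensorFlow implementation.

```python
import numpy as np

def conditioner(x_prev):
    """Hypothetical stand-in for MADE: maps x_{<i} to (shift, log-scale)."""
    s = x_prev.sum()
    return 0.1 * s, 0.05 * s

def maf_log_prob(x):
    """log p(x) under a standard-normal base distribution."""
    u = np.empty_like(x)
    log_det = 0.0
    for i in range(len(x)):
        mu, alpha = conditioner(x[:i])          # depends only on x_{<i}
        u[i] = (x[i] - mu) * np.exp(-alpha)     # inverse affine transform
        log_det -= alpha                        # d u_i / d x_i = exp(-alpha_i)
    base = -0.5 * (u**2 + np.log(2 * np.pi)).sum()
    return base + log_det

print(maf_log_prob(np.array([0.3, -1.2, 0.7])))
```

In the actual MAF, `conditioner` is a single masked network evaluated once for all dimensions; the loop here only makes the autoregressive dependence explicit.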

### 7. MADE: Masked Autoencoder for Distribution Estimation

Published: 2021-02-27

References:
- MADE: Masked Autoencoder for Distribution Estimation

Summary: refer to the references for the topics covered.
Notes: 1310.8499_notes.pdf

### 6. Deep AutoRegressive Networks

Published: 2021-02-13

References:
- Gregor, K., Danihelka, I., Mnih, A., Blundell, C., & Wierstra, D. (2014). Deep autoregressive networks. 31st International Conference on Machine Learning, ICML 2014, 4, 2991–3000.
- Autoregressive models

Summary: refer to the references for the topics covered.
Notes: 1310.8499_notes.pdf

### 5. Review of Normalizing Flow

Published: 2021-01-30

Summary: Topics covered:
- Normalizing flow
- Applications of normalizing flow
- Methods of normalizing flow
- Problems of normalizing flow
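The core rule behind all of these topics is the change of variables: for an invertible map x = f(u) with base density p_u, log p_x(x) = log p_u(f^{-1}(x)) + log |det d f^{-1}/d x|. A minimal numerical sketch (illustrative only; the elementwise affine map and its parameters are made up):

```python
import numpy as np

# Elementwise affine flow x = a * u + b over a standard-normal base.
a, b = np.array([2.0, 0.5]), np.array([1.0, -1.0])

def log_prob(x):
    u = (x - b) / a                                   # inverse map f^{-1}
    base = -0.5 * (u**2 + np.log(2 * np.pi)).sum()    # log p_u(u)
    log_det = -np.log(np.abs(a)).sum()                # log |det Jacobian of f^{-1}|
    return base + log_det

print(log_prob(np.array([1.0, -1.0])))
```

Practical flows (planar, coupling, autoregressive) differ only in choosing f so that both the inverse and the log-determinant stay cheap to compute.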

### 4. Variational Inference and Normalizing Flow

Published: 2021-01-16

References:
- Bishop, C. M. (2006). Pattern Recognition and Machine Learning. Springer.
- Rezende, D. J., & Mohamed, S. (2015). Variational Inference with Normalizing Flows.

Summary: Topics covered:
- Variational Inference
- Normalizing Flow
- Variational Inference with Normalizing Flows

### 3. EM Methods

Published: 2021-01-02

References:
- Hastie, T., Tibshirani, R., & Friedman, J. (2004). The Elements of Statistical Learning. Springer Science & Business Media.
- Bishop, C. M. (2006). Pattern Recognition and Machine Learning. Springer.

Summary: The EM (expectation-maximization) algorithm is an inspiring iterative method for maximizing the log-likelihood by introducing intermediate variables such as responsibilities.
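One EM iteration for a two-component 1-D Gaussian mixture can be sketched as follows (an illustrative example with made-up data, not taken from the session notes): the E-step computes the responsibilities, i.e. the posterior probability of each component for each point, and the M-step re-estimates the parameters as responsibility-weighted maximum-likelihood updates.

```python
import numpy as np

def normal_pdf(x, mu, var):
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

def em_step(x, pi, mu, var):
    # E-step: responsibility resp[k, n] = P(component k | x_n).
    w = np.stack([pi[k] * normal_pdf(x, mu[k], var[k]) for k in range(2)])
    resp = w / w.sum(axis=0)
    # M-step: responsibility-weighted parameter updates.
    n_k = resp.sum(axis=1)
    pi_new = n_k / len(x)
    mu_new = (resp * x).sum(axis=1) / n_k
    var_new = (resp * (x - mu_new[:, None]) ** 2).sum(axis=1) / n_k
    return pi_new, mu_new, var_new

# Two well-separated clusters around -2 and +2.
x = np.array([-2.1, -1.9, -2.0, 1.9, 2.0, 2.1])
pi, mu, var = em_step(x, np.array([0.5, 0.5]),
                      np.array([-1.0, 1.0]), np.array([1.0, 1.0]))
```

Even a single step pulls the means from the initial ±1 toward the cluster centers at ±2; iterating the step to convergence yields a local maximum of the log-likelihood.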

### 2. Least Squares, Bootstrap, Maximum Likelihood, and Bayesian

Published: 2020-12-12

Summary: Relations and differences between least squares, bootstrap, maximum likelihood, and Bayesian methods in regression.
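One such relation can be shown numerically (an illustrative sketch with synthetic data): under i.i.d. Gaussian noise, maximizing the likelihood of y = Xw + eps is equivalent to minimizing the squared error, so the maximum-likelihood estimate coincides with the least-squares solution w = (X^T X)^{-1} X^T y.

```python
import numpy as np

# Synthetic data: intercept 1.0, slope 2.0, small Gaussian noise.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(50), rng.normal(size=50)])
y = X @ np.array([1.0, 2.0]) + 0.1 * rng.normal(size=50)

w_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)   # least-squares fit
w_normal = np.linalg.solve(X.T @ X, X.T @ y)      # closed-form normal equations
print(w_lstsq, w_normal)
```

Both routes give the same coefficients; the Bayesian treatment differs by placing a prior on w, and the bootstrap differs by resampling the data to estimate the variability of these coefficients rather than changing the point estimate.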

### 1. Conditional Probability and Bayes

Published: 2020-11-18

References:
- Hastie, T., Tibshirani, R., & Friedman, J. (2004). The Elements of Statistical Learning. Springer Science & Business Media.
- Association Rules
- Bayes' theorem @ Wikipedia
- Ross, S. M. (2014). Introduction to Probability and Statistics for Engineers and Scientists. Elsevier.
- Naive Bayes

Summary: Skeleton notes for conditional probability and Bayes' theorem
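A small numeric illustration of Bayes' theorem, P(A|B) = P(B|A) P(A) / P(B), with the denominator expanded by the law of total probability (the test-accuracy numbers below are hypothetical):

```python
# Classic diagnostic-test example: how likely is disease given a positive test?
p_disease = 0.01            # prior P(D)
p_pos_given_d = 0.95        # sensitivity P(+|D)
p_pos_given_not_d = 0.05    # false-positive rate P(+|~D)

# Law of total probability for the evidence P(+).
p_pos = p_pos_given_d * p_disease + p_pos_given_not_d * (1 - p_disease)

# Bayes' theorem: posterior P(D|+).
p_d_given_pos = p_pos_given_d * p_disease / p_pos
print(round(p_d_given_pos, 4))
```

Despite the accurate test, the posterior stays around 16% because the low prior dominates, which is exactly the kind of base-rate effect conditional-probability reasoning is meant to capture.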
