Machine Learning in Physics
Lecture notes:
- machine learning fundamentals
- deep neural networks
- convolutional neural networks (CNNs)
- recurrent neural networks (RNNs)
- attention and transformers
- autoencoders and GANs
- graph CNNs
- group-equivariant CNNs
- physics-informed NNs
- self- and semi-supervised learning
Laboratory classes:
- Preliminary problems
  - simple perceptron networks
  - Universal Approximation Theorem
  - Colab notebook, see also updated notebook
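The Universal Approximation Theorem states that a feed-forward network with a single hidden layer and enough units can approximate any continuous function on a compact set to arbitrary accuracy. As a minimal illustration (a NumPy sketch, independent of the course notebooks), a one-hidden-layer tanh network trained by full-batch gradient descent can fit sin(x) on [-π, π]:

```python
import numpy as np

rng = np.random.default_rng(0)

# Target: sin(x) sampled on [-pi, pi]
x = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
y = np.sin(x)

# One hidden layer of H tanh units, linear output
H = 30
W1 = rng.normal(0.0, 1.0, (1, H))
b1 = np.zeros(H)
W2 = np.zeros((H, 1))   # start from the zero function (initial MSE = mean(sin^2) ~ 0.5)
b2 = np.zeros(1)

lr = 0.02
n = len(x)
for step in range(3000):
    # Forward pass
    h = np.tanh(x @ W1 + b1)        # (n, H) hidden activations
    y_hat = h @ W2 + b2             # (n, 1) network output
    err = y_hat - y
    loss = np.mean(err ** 2)

    # Backward pass: gradients of the MSE loss
    g_yhat = 2.0 * err / n
    g_W2 = h.T @ g_yhat
    g_b2 = g_yhat.sum(axis=0)
    g_h = g_yhat @ W2.T
    g_pre = g_h * (1.0 - h ** 2)    # tanh derivative
    g_W1 = x.T @ g_pre
    g_b1 = g_pre.sum(axis=0)

    # Gradient-descent update
    W1 -= lr * g_W1; b1 -= lr * g_b1
    W2 -= lr * g_W2; b2 -= lr * g_b2

print(f"final MSE: {loss:.4f}")
```

Increasing H tightens the fit, in line with the theorem; the theorem guarantees existence of a good approximation, not that gradient descent finds it, which is worth checking empirically.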
- Image classification using the MNIST dataset
  - models: perceptron, deep fully-connected network, generic CNN
  - overfitting, regularization, early stopping
  - Colab notebook
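The overfitting controls listed above can be sketched library-agnostically. The lab notebook itself runs in Colab; as a small self-contained stand-in for MNIST, scikit-learn's bundled 8×8 digits dataset and `MLPClassifier` show the same ideas: L2 regularization via `alpha` and early stopping on a held-out validation split:

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# 8x8 digit images as a lightweight stand-in for MNIST
X, y = load_digits(return_X_y=True)
X = X / 16.0  # scale pixel values to [0, 1]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Fully-connected network with L2 regularization and early stopping
clf = MLPClassifier(
    hidden_layer_sizes=(64,),
    alpha=1e-4,             # L2 penalty (regularization strength)
    early_stopping=True,    # hold out 10% of training data as validation
    n_iter_no_change=10,    # stop when validation score stalls for 10 epochs
    max_iter=300,
    random_state=0,
)
clf.fit(X_tr, y_tr)
print(f"test accuracy: {clf.score(X_te, y_te):.3f}")
```

Comparing test accuracy with and without `early_stopping=True`, or across values of `alpha`, reproduces the overfitting-vs-regularization trade-off from the lab on a dataset that needs no download.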
  Extra tasks:
  - augmentation: apply simple geometric transformations (see e.g. here) and check whether extending the dataset this way improves accuracy:
    - use simple transformations (e.g. flip, rotate, translate, scale) with scikit-image or OpenCV,
    - or use the TorchVision library on the fly during training.
    - Verify whether applying flips or rotations > 45 deg improves accuracy or not; why?
  - generalization on a wallpaper-groups dataset:
    - repeat the classifier training on a dataset of 2D crystallographic structures;
    - can we extract similarities between the classes from the confusion matrix?
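Offline augmentation amounts to appending transformed copies of the images with the original labels. A minimal NumPy sketch (90° rotations and flips only; scikit-image, OpenCV, or TorchVision handle arbitrary angles and on-the-fly application):

```python
import numpy as np

def augment(images, labels):
    """Extend a dataset with horizontally flipped and 90-degree-rotated copies.

    images: array of shape (N, H, W); labels: array of shape (N,).
    Note: for digit-like data, flips and large rotations can change the
    semantic class (e.g. 6 vs 9) - which is exactly why the
    'rotations > 45 deg' question above is worth testing.
    """
    flipped = images[:, :, ::-1]                    # horizontal flip
    rotated = np.rot90(images, k=1, axes=(1, 2))    # rotate each image by 90 deg
    aug_images = np.concatenate([images, flipped, rotated], axis=0)
    aug_labels = np.concatenate([labels, labels, labels], axis=0)
    return aug_images, aug_labels

# Example: a tiny batch of 4 "images" of size 8x8
imgs = np.arange(4 * 8 * 8, dtype=float).reshape(4, 8, 8)
labs = np.array([0, 1, 2, 3])
aug_imgs, aug_labs = augment(imgs, labs)
print(aug_imgs.shape, aug_labs.shape)  # dataset is tripled
```

Retraining the classifier on the extended arrays and comparing test accuracy against the unaugmented baseline answers the task's question empirically.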
- ECG signal classification
  - comparison of classifiers: SVM, decision trees, random forests
  - feature vectors and dimensionality reduction (PCA)
  - scikit-learn library
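The comparison pipeline can be sketched with scikit-learn alone. Since the ECG dataset is not bundled here, the sketch below uses a synthetic stand-in from `make_classification`; each model sits in a pipeline that standardizes the feature vectors, reduces them with PCA, and then classifies:

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for ECG feature vectors: 20 features, 3 classes
X, y = make_classification(n_samples=600, n_features=20, n_informative=8,
                           n_classes=3, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Each pipeline: standardize -> project to 8 PCA components -> classify
models = {
    "SVM": SVC(kernel="rbf"),
    "decision tree": DecisionTreeClassifier(random_state=0),
    "random forest": RandomForestClassifier(n_estimators=100, random_state=0),
}
for name, clf in models.items():
    pipe = make_pipeline(StandardScaler(), PCA(n_components=8), clf)
    pipe.fit(X_tr, y_tr)
    print(f"{name:14s} test accuracy: {pipe.score(X_te, y_te):.3f}")
```

Swapping in real ECG feature vectors only changes the data-loading lines; varying `n_components` shows how much discriminative information PCA retains.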
- Group-equivariant CNNs
- Physics-informed NNs
- Transformer encoder or GANs?
Literature:
- K. P. Murphy, Probabilistic Machine Learning: An Introduction
- K. P. Murphy, Probabilistic Machine Learning: Advanced Topics
- M. M. Bronstein et al., Geometric Deep Learning: Grids, Groups, Graphs, Geodesics, and Gauges