July 24th, 2024

Crash Course in Deep Learning (For Computer Graphics)

Jakub Boksansky discusses deep learning in computer graphics, focusing on multilayer perceptrons (MLPs). The article covers MLP structure, training, and a sample application for texture representation using HLSL/DirectX 12.

Jakub Boksansky, a computer graphics researcher at AMD, shares insights from his learning journey in deep learning for computer graphics. The article aims to familiarize readers with key terms and concepts in deep learning and to guide them through implementing a basic deep learning algorithm; it includes source code and a sample application in HLSL/DirectX 12.

The focus is on artificial neural networks, specifically multilayer perceptrons (MLPs), which consist of interconnected neurons organized into layers. MLPs are fully connected, feed-forward networks: information flows in one direction from inputs to outputs, which makes them suitable for tasks like classification and regression. The article explains the structure of MLPs, detailing how each neuron computes its output from weighted connections, a bias, and an activation function. It emphasizes the importance of training, which adjusts the weights and biases until the network produces the desired outputs, and then discusses the implementation of inference, which calculates the network's outputs for given inputs after training.

The sample application trains the MLP to represent a 2D texture by mapping UV coordinates to RGB values. Overall, the article pairs an overview of deep learning principles with a practical implementation, making it a useful resource for anyone applying deep learning techniques in computer graphics.
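To make the inference step concrete, here is a minimal sketch of how such an MLP might be evaluated in an HLSL compute shader, feeding each pixel's UV coordinates through a single hidden layer to produce RGB. The network size (2-16-3), flat weight-buffer layout, resource names (gWeights, gOutput), and ReLU activation are illustrative assumptions, not details taken from the article's sample application.

    // Minimal MLP inference sketch in an HLSL compute shader.
    // Assumed layout: a 2-16-3 network (UV in, RGB out), with weights and
    // biases stored back-to-back in gWeights. Names and sizes are illustrative.

    #define HIDDEN 16

    StructuredBuffer<float> gWeights : register(t0); // trained weights + biases
    RWTexture2D<float4>     gOutput  : register(u0); // reconstructed texture

    float Relu(float x) { return max(x, 0.0f); }

    [numthreads(8, 8, 1)]
    void main(uint3 id : SV_DispatchThreadID)
    {
        uint width, height;
        gOutput.GetDimensions(width, height);
        if (id.x >= width || id.y >= height) return;

        // Network input: UV coordinates of this pixel.
        float2 uv = (float2(id.xy) + 0.5f) / float2(width, height);

        // Hidden layer: each neuron computes a weighted sum of the inputs
        // plus a bias, then applies the activation function.
        float hidden[HIDDEN];
        for (uint i = 0; i < HIDDEN; i++)
        {
            uint base = i * 3; // 2 weights + 1 bias per hidden neuron
            hidden[i] = Relu(gWeights[base + 0] * uv.x +
                             gWeights[base + 1] * uv.y +
                             gWeights[base + 2]);
        }

        // Output layer: one linear neuron per RGB channel.
        float3 rgb = 0.0f;
        for (uint c = 0; c < 3; c++)
        {
            uint base = HIDDEN * 3 + c * (HIDDEN + 1);
            float sum = gWeights[base + HIDDEN]; // bias comes last
            for (uint i = 0; i < HIDDEN; i++)
                sum += gWeights[base + i] * hidden[i];
            rgb[c] = sum;
        }

        gOutput[id.xy] = float4(saturate(rgb), 1.0f);
    }

Keeping all weights and biases in one flat buffer keeps the shader simple; a complete implementation would also need the training pass that produces these weights, which the article covers separately.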

Related

Linear Algebra 101 for AI/ML

The introduction to linear algebra for AI/ML covers basic concepts such as scalars, vectors, matrices, vector/matrix operations, PyTorch basics, and mathematical notation, with simplified explanations that help beginners grasp the fundamentals efficiently.

ML from Scratch, Part 3: Backpropagation (2019)

The article explains backpropagation in neural networks, detailing the equations, matrix operations, and activation functions involved. It grounds the method in linear algebra and calculus, covering model fitting, parameter optimization, and binary cross-entropy as the loss to minimize. Gradients and deltas are calculated iteratively, layer by layer.
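For reference, the gradient and delta computation that a backpropagation walkthrough like this builds up to can be stated compactly. The notation below (weights W^l, biases b^l, pre-activations z^l, activations a^l = sigma(z^l), loss C) is a standard convention assumed here rather than taken from the post:

    \delta^L = \nabla_a C \odot \sigma'(z^L)
    \delta^l = \left( (W^{l+1})^\top \delta^{l+1} \right) \odot \sigma'(z^l)
    \frac{\partial C}{\partial W^l} = \delta^l \, (a^{l-1})^\top, \qquad
    \frac{\partial C}{\partial b^l} = \delta^l

Evaluating these from the output layer backward, one layer at a time, is the iterative delta calculation the summary refers to.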

From the Tensor to Stable Diffusion

The GitHub repository offers a comprehensive machine learning guide covering deep learning, vision-language models, neural networks, CNNs, RNNs, and paper implementations such as LeNet, AlexNet, ResNet, GRU, LSTM, CBOW, Skip-Gram, Transformer, and BERT, making it a good entry point for exploring machine learning concepts.

Physics-Based Deep Learning Book

The Physics-based Deep Learning Book (v0.2) introduces deep learning for physical simulations, covering topics like physical loss constraints, tailored training algorithms, and uncertainty modeling. It includes Jupyter notebooks for practical learning.

Show HN: I created a Neural Network from scratch, in scratch

The article discusses implementing a one-layer feed-forward network in Scratch for image classification. The author faced challenges handling multi-dimensional data, but achieved promising results after training on a limited portion of the MNIST dataset.

2 comments
By @sgt101 - 6 months
What has this got to do with computer graphics?