What I've been reading:

  • Recurrent Neural Networks (RNNs): A Gentle Introduction and Overview

    What I got from this:
    This is by far the clearest explanation of backpropagation through time (BPTT) that I have read. It does a great job outlining the vanishing and exploding gradient problems and explaining how LSTMs reduce the risk of the former, all while keeping the explanation concise. To keep the gating mechanics fresh, I put a small LSTM sketch right after this summary.
    The paper also gives solid, concise overviews of popular RNN variants and advancements: deep RNNs, bidirectional RNNs, LSTMs, encoder-decoder / seq2seq models, Transformers, and pointer networks.
    The sections on auto-encoders and Transformers were the most beneficial to me. With all the work being done in the data science world around Transformers, it's always good to make sure I have a solid base in the low-level mechanics and functionality of these models.
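
    Not from the paper itself, just my own quick NumPy sketch of a single LSTM step so the gating equations stick. The dimensions, weights, and the `lstm_step` helper are all made up for illustration; the point is the additive cell-state update that the paper credits with easing vanishing gradients.

    ```python
    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def lstm_step(x_t, h_prev, c_prev, W, U, b):
        """One LSTM time step with stacked gate parameters.

        W, U, b hold the parameters for the four gates in order:
        input i, forget f, output o, candidate g.
        """
        z = W @ x_t + U @ h_prev + b          # shape: (4 * hidden_dim,)
        i, f, o, g = np.split(z, 4)
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        g = np.tanh(g)
        # Key detail: the cell state is updated additively, and the direct
        # path from c_prev to c_t is scaled only by the forget gate f,
        # not squashed through tanh at every step like a vanilla RNN's
        # hidden state -- this is what eases the vanishing-gradient problem.
        c_t = f * c_prev + i * g
        h_t = o * np.tanh(c_t)
        return h_t, c_t

    # Toy run with made-up sizes and random weights (illustration only).
    rng = np.random.default_rng(0)
    input_dim, hidden_dim = 8, 16
    W = rng.normal(scale=0.1, size=(4 * hidden_dim, input_dim))
    U = rng.normal(scale=0.1, size=(4 * hidden_dim, hidden_dim))
    b = np.zeros(4 * hidden_dim)

    h, c = np.zeros(hidden_dim), np.zeros(hidden_dim)
    for t in range(5):                        # unroll a short toy sequence
        x_t = rng.normal(size=input_dim)
        h, c = lstm_step(x_t, h, c, W, U, b)
    print(h.shape, c.shape)                   # (16,) (16,)
    ```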

  • Sequence Modeling: Recurrent and Recursive Nets

    TODO: Here is a summary from what I read. To see more on what I think, read my blog post here:

  • A Gentle Tutorial of Recurrent Neural Network with Error Backpropagation

    TODO: Here is a summary from what I read. To see more on what I think, read my blog post here:

  • Long Short-Term Memory

    TODO: Here is a summary from what I read. To see more on what I think, read my blog post here:

  • Attention Is All You Need

    TODO: Here is a summary from what I read. To see more on what I think, read my blog post here:

  • Auto-Encoder: What Is It? And What Is It Used For?

    TODO: Here is a summary from what I read. To see more on what I think, read my blog post here:

  • 5 Ways to Detect Outliers/Anomalies That Every Data Scientist Should Know

    TODO: Here is a summary from what I read. To see more on what I think, read my blog post here:

  • Random Forest in Python

    TODO: Here is a summary from what I read. To see more on what I think, read my blog post here:

  • Convolutional Neural Networks Explained

    TODO: Here is a summary from what I read. To see more on what I think, read my blog post here:

  • The Unreasonable Effectiveness of Recurrent Neural Networks

    TODO: Here is a summary from what I read. To see more on what I think, read my blog post here:

  • LSTMs Explained: A Complete, Technically Accurate, Conceptual Guide with Keras

    TODO: ~~Understand why this is an important improvement over RNNs~~ Here is a summary from what I read. To see more on what I think, read my blog post here:

  • DCGAN, cGAN and SAGAN & the CIFAR-10 dataset

    TODO: Here is a summary from what I read. To see more on what I think, read my blog post here:

  • Introduction to Diffusion Models for Machine Learning

    TODO: Here is a summary from what I read. To see more on what I think, read my blog post here:

  • An Overview of ResNet and its Variants

    TODO: ~~Understand why these are important in terms of information flow. Relate it to GANs.~~ Here is a summary from what I read. To see more on what I think, read my blog post here:

  • The Illustrated Transformer

    TODO: Here is a summary from what I read. To see more on what I think, read my blog post here:

  • Solving Math Word Problems

    TODO: Here is a summary from what I read. To see more on what I think, read my blog post here:

  • Summarizing Books with Human Feedback

    TODO: Here is a summary from what I read. To see more on what I think, read my blog post here:

  • Multimodal Neurons in Artificial Neural Networks

    TODO: Here is a summary from what I read. To see more on what I think, read my blog post here:

  • AI and Efficiency

    TODO: Here is a summary from what I read. To see more on what I think, read my blog post here:

  • Deep Double Descent

    TODO: Here is a summary from what I read. To see more on what I think, read my blog post here:

  • Generative Modeling with Sparse Transformers

    TODO: Here is a summary from what I read. To see more on what I think, read my blog post here:

  • Dreamento: An open-source dream engineering toolbox utilizing sleep wearable

    TODO: Here is a summary from what I read. To see more on what I think, read my blog post here:

  • Anomaly Detection with Machine Learning

    TODO: Here is a summary from what I read. To see more on what I think, read my blog post here:

  • Diffusion Models for Video Modeling

    TODO: Here is a summary from what I read. To see more on what I think, read my blog post here:
