AI Formula
Basic Concepts
- Loss Function
- Gradient Descent Algorithm
- Activation Function in Deep Learning
- MLP (Multi-Layer Perceptron) Neural Network
- CNN (Convolutional Neural Network)
- Image (RGB and Grayscale)
Data Processing
Data Normalization (dimensionless scaling)
https://github.com/fengdu78/Coursera-ML-AndrewNg-Notes
- Prompt optimization
Loss Function
- Loss Function for Regression Tasks (Used for predicting continuous values)
- Loss Function for Classification Tasks (Used for predicting discrete categories)
- Loss Function for Object Detection/Segmentation Tasks
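The two most common cases can be sketched in NumPy (a minimal illustration; the function names are my own):

```python
import numpy as np

def mse_loss(y_true, y_pred):
    # Mean Squared Error: the standard loss for regression tasks
    return np.mean((y_true - y_pred) ** 2)

def cross_entropy_loss(y_true, y_prob, eps=1e-12):
    # Cross-entropy: the standard loss for classification tasks.
    # y_true is one-hot encoded; y_prob are predicted class probabilities.
    y_prob = np.clip(y_prob, eps, 1.0)  # avoid log(0)
    return -np.sum(y_true * np.log(y_prob)) / y_true.shape[0]
```

Object detection and segmentation losses (e.g. IoU-based losses) combine regression and classification terms, so they build on these two.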
Gradient Descent Algorithm

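The algorithm repeatedly steps a parameter against the gradient of the objective, scaled by a learning rate. A minimal sketch (the function names are illustrative):

```python
def gradient_descent(grad_fn, x0, lr=0.1, steps=100):
    """Minimize a function by repeatedly stepping against its gradient."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad_fn(x)  # update rule: x <- x - lr * grad f(x)
    return x

# Minimize f(x) = (x - 3)^2, whose gradient is 2 * (x - 3).
minimum = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
```

Too large a learning rate diverges; too small a one converges slowly, which is why the rate is the key hyperparameter here.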
Activation Function

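The three classic activation functions can be written in a few lines of NumPy (a sketch for reference):

```python
import numpy as np

def sigmoid(x):
    # squashes input to (0, 1); historically used for binary outputs
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # squashes input to (-1, 1); zero-centered version of sigmoid
    return np.tanh(x)

def relu(x):
    # zero for negative inputs, identity for positive ones;
    # the default choice in modern deep networks
    return np.maximum(0.0, x)
```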
MLP Principle

Linear Transformation

z = W x + b

Where:
- W is the weight matrix,
- x is the input vector,
- b is the bias term.

Activation Function

a = f(z)

SoftMax (derivation omitted) converts the output vector into a probability distribution.

Backpropagation

W ← W − α ∂L/∂W

Where α is the learning rate.
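The forward pass described above (linear transform, activation, SoftMax) can be sketched in NumPy; the function names and the choice of ReLU for the hidden layer are illustrative assumptions:

```python
import numpy as np

def softmax(z):
    # subtract the max for numerical stability before exponentiating
    e = np.exp(z - np.max(z))
    return e / e.sum()

def mlp_forward(x, W1, b1, W2, b2):
    # hidden layer: linear transform z = Wx + b, then ReLU activation
    h = np.maximum(0.0, W1 @ x + b1)
    # output layer: linear transform followed by SoftMax,
    # yielding a probability distribution over classes
    return softmax(W2 @ x if False else W2 @ h + b2)
```

Backpropagation then applies the update rule above to W1, b1, W2, b2 using gradients of the loss computed via the chain rule.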
CNN Principle

A CNN typically consists of four kinds of layers:
- Convolutional Layer
- Pooling Layer
- Fully Connected Layer
- Output Layer
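The two CNN-specific operations, convolution and pooling, can be sketched in plain NumPy (for clarity, not efficiency; like most deep-learning frameworks, the "convolution" here is technically cross-correlation):

```python
import numpy as np

def conv2d(image, kernel):
    # valid convolution of a single-channel image with one kernel
    kh, kw = kernel.shape
    H, W = image.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # elementwise product of the kernel with each image patch
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool2d(x, size=2):
    # non-overlapping max pooling: downsample by taking the max
    # of each size x size window
    H, W = x.shape
    return x[:H - H % size, :W - W % size].reshape(
        H // size, size, W // size, size).max(axis=(1, 3))
```

The fully connected and output layers are the same linear-transform-plus-activation structure as in the MLP above, applied to the flattened feature maps.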
Other Neural Networks
- Recurrent Neural Network (RNN)
- Long Short-Term Memory (LSTM)
- Gated Recurrent Unit (GRU)
- Autoencoder
- Generative Adversarial Network (GAN)
- Transformer Network
- Graph Neural Network (GNN)
- Reinforcement Learning (RL)
- Attention Mechanism
https://chatgpt.com/share/67a7ee4e-d198-8009-996d-cd7cb5e11c65
Agreement
The code part of this work is licensed under the Apache License 2.0. You may freely modify and redistribute the code, and use it for commercial purposes, provided that you comply with the license. However, you are required to:
- Attribution: retain the original author's signature and code-source information in the original and derivative code.
- Preserve License: retain the Apache 2.0 license file in the original and derivative code.

The non-code content is subject to the following terms:
- Attribution: give appropriate credit, provide a link to the license, and indicate if changes were made.
- NonCommercial: you may not use the material for commercial purposes. For commercial use, please contact the author.
- ShareAlike: if you remix, transform, or build upon the material, you must distribute your contributions under the same license as the original.