19.07.2020 — Neural Network/DL
Expectations / Hopes
Beginning Thoughts
- Solve more complex issues
- Real-world DL apps (image processing and more):
  - Ears → Alexa, Siri
  - Vision → Google Photos / Google Lens / Office Lens / Amazon / OCR
  - Speech → Alexa, Siri, Google Home, Cortana, Google's Transcribe
  - Text/Writing → Google Search (NLP) / Sentiment Analysis / Transformers / Chatbots
Libraries: scikit-learn, TensorFlow, PyTorch, and the Keras API
Cognitive Services
Three major reasons for DL's popularity
- Availability of big data
  - Types of data / sources
- Cloud computing / speed of computation → GPUs (Google Colab / Kaggle / IBM Watson?)
- Accessibility of algorithms
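Since the notes point to Colab/Kaggle GPU runtimes, a minimal sketch for checking that TensorFlow actually sees an accelerator on such a runtime (works the same locally):

```python
import tensorflow as tf

# List the accelerators TensorFlow can see on this runtime (e.g. a Colab/Kaggle GPU).
gpus = tf.config.list_physical_devices("GPU")
print("GPUs visible:", gpus if gpus else "none - running on CPU")
```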
Neural Network (ANN / CNN), inspired by the brain:
1. Input layer
2. Output layer
3. Hidden layer
4. Loss layer

Toy example, an OR-style truth table (4 input patterns, 2 possible outputs):
0,0 → 0
0,1 → 1
1,0 → 1
1,1 → 1
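To make the layer list and the truth table concrete, here is a minimal Keras sketch (one of the libraries listed above) that fits a tiny input → hidden → output network to the OR-style table; layer sizes, learning rate, and epoch count are illustrative assumptions, not fixed choices:

```python
import numpy as np
import tensorflow as tf

# OR-style truth table: 4 input patterns, 2 possible outputs (0 or 1).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=np.float32)
y = np.array([0, 1, 1, 1], dtype=np.float32)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(2,)),                      # input layer: 2 features
    tf.keras.layers.Dense(4, activation="relu"),     # hidden layer
    tf.keras.layers.Dense(1, activation="sigmoid"),  # output layer
])

# The loss function (the "loss layer") and the optimizer drive the weight updates.
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.1),
              loss="binary_crossentropy",
              metrics=["accuracy"])
model.fit(X, y, epochs=200, verbose=0)

print(model.predict(X, verbose=0).round().flatten())  # expected: [0. 1. 1. 1.]
```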
5. What are the keywords? (illustrated in the sketch after this list)
- Activation Functions
- Network Topology: Nodes and Links (aka Edges)
- Weights
- Threshold
- Error or Loss Function
- Forward Propagation / Backward Propagation
- Gradient Descent
- Hyperparameters
- Optimizers
- Choice of Distance Metric
- Learning Rate
- Black Box
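The keywords above map almost one-to-one onto a hand-rolled training loop. A minimal NumPy sketch (an assumed setup, reusing the OR data) showing weights, an activation function, forward propagation, a loss function, backward propagation, and a gradient-descent update scaled by the learning rate:

```python
import numpy as np

# A single sigmoid neuron trained by plain gradient descent on the OR data,
# to make the keywords concrete. Hypothetical minimal sketch, not a library API.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 1], dtype=float)

rng = np.random.default_rng(0)
w = rng.normal(size=2)   # weights on the links/edges
b = 0.0                  # bias (related to the threshold idea)
lr = 0.5                 # learning rate (a hyperparameter)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))   # activation function

for epoch in range(2000):
    # Forward propagation: inputs -> weighted sum -> activation -> prediction.
    z = X @ w + b
    p = sigmoid(z)
    # Loss function: binary cross-entropy (the error being minimized).
    loss = -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))
    if epoch % 500 == 0:
        print(f"epoch {epoch}: loss = {loss:.4f}")
    # Backward propagation: gradient of the loss w.r.t. weights and bias.
    grad_z = (p - y) / len(y)
    grad_w = X.T @ grad_z
    grad_b = grad_z.sum()
    # Gradient descent step, scaled by the learning rate.
    w -= lr * grad_w
    b -= lr * grad_b

print("weights:", w, "bias:", b)
print("predictions:", sigmoid(X @ w + b).round())   # expected: [0. 1. 1. 1.]
```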
6. How to decide on which optimizer / how to set the learning rate / data exploration on MNIST / which activation function to choose / TensorBoard demo / why not PyTorch?
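For the open questions in point 6, a starter sketch, assuming the standard Keras MNIST dataset and an arbitrary log directory name, that takes a first look at the data and logs training to TensorBoard; the optimizer, learning rate, and activation choices are placeholders to experiment with:

```python
import tensorflow as tf

# Load MNIST and take a first look at shapes and pixel range (data exploration).
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
print(x_train.shape, y_train.shape)                 # (60000, 28, 28) (60000,)
print("pixel range:", x_train.min(), x_train.max())

x_train, x_test = x_train / 255.0, x_test / 255.0   # scale pixels to [0, 1]

model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),    # activation: try relu/tanh/...
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),  # optimizer + LR to tune
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Log training curves for TensorBoard; view with: tensorboard --logdir logs
tensorboard_cb = tf.keras.callbacks.TensorBoard(log_dir="logs/mnist")
model.fit(x_train, y_train, epochs=5,
          validation_data=(x_test, y_test), callbacks=[tensorboard_cb])
```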