19.07.2020 — Neural Network/DL

Pradeep Ankem


Expectations / Hopes

Beginning Thoughts

  1. Solve more complex issues
  2. Real world DL apps → Image Processing
  3. Ears → Alexa, Siri
  4. Vision → Google Photos / Google Lens/ Office Lens/ Amazon/ OCR
  5. Speech → Alexa, Siri, Google Home, Cortana, Google’s Transcribe
  6. Text/Write → Google search (NLP)/ Sentiment Analysis/ Transformer/ Chatbots

Libraries — Scikit Learn, TensorFlow, PyTorch and Keras API

Cognitive Services

4 major reasons for DL popularity

  1. Availability of big Data
  2. Types of Data/ Sources
  3. Cloud Computing/Speed of computation → GPU (Google Colab/Kaggle/IBM Watson ?)
  4. Accessibility to Algorithms
Neural Network (ANN / CNN), inspired by the Brain

  1. Input layer
  2. Output layer
  3. Hidden layer
  4. Loss layer

Example: the OR-gate truth table

  0,0 then 0
  0,1 then 1
  1,0 then 1
  1,1 then 1

4 input combinations, 2 possible outputs (0 or 1)
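The OR-gate table above is simple enough for the most basic neural network, a single-layer perceptron. A minimal numpy sketch (variable names are my own, not from the session):

```python
import numpy as np

# OR-gate truth table: 4 input pairs, 1 target output each
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 1, 1, 1])

w = np.zeros(2)  # weights
b = 0.0          # bias (plays the role of the threshold)
lr = 0.1         # learning rate

# Classic perceptron learning rule with a step activation
for epoch in range(20):
    for xi, target in zip(X, y):
        pred = 1 if xi @ w + b > 0 else 0  # forward pass
        err = target - pred                # error
        w += lr * err * xi                 # update weights
        b += lr * err                      # update bias

preds = [1 if xi @ w + b > 0 else 0 for xi in X]
print(preds)  # → [0, 1, 1, 1]
```

Since OR is linearly separable, the perceptron converges to the correct table; XOR, famously, would not, which is why hidden layers matter.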

5. What are the keywords

Activation Functions
Network Topology: Nodes and Links (aka Edges)
Weights
Threshold
Error or Loss Function
Forward Propagation / Backward Propagation
Gradient Descent
Hyperparameters
Optimizers
Choice of Distance Metric
Learning Rate
Black Box
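Several of these keywords (forward propagation, loss function, backward propagation, gradient descent, learning rate) can be seen together in one tiny numpy loop. The toy data and names are illustrative only, assuming a single-weight model fitting y = 2x:

```python
import numpy as np

# Hypothetical toy data: learn y = 2x with a single weight
x = np.array([1.0, 2.0, 3.0, 4.0])
y = 2.0 * x

w = 0.0    # initial weight
lr = 0.01  # learning rate (a hyperparameter)

for step in range(200):
    y_hat = w * x                        # forward propagation
    loss = np.mean((y_hat - y) ** 2)     # loss function (MSE)
    grad = np.mean(2 * (y_hat - y) * x)  # backward propagation (gradient)
    w -= lr * grad                       # gradient descent update

print(round(w, 3))  # → 2.0
```

Real optimizers (SGD with momentum, Adam, etc.) refine exactly this update step, which is why the choice of optimizer and learning rate comes up next.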

6. How to decide on which Optimizer / How to choose the Learning Rate / Data exploration on MNIST / Which Activation function to choose / TensorBoard Demo / Why not PyTorch?

