Machine Learning With JavaScript

About This Course

This is a free course.

Please consider supporting our cause: your donations help fund our free ICT training classes and befriending events for the elderly.

You can learn more about ways to support our cause here.

What Will I Learn?

  • Machine Learning With JavaScript

Course Curriculum

Machine Learning With JavaScript

  • 1. Getting Started – How to Get Help.
    00:58
  • 2. Solving Machine Learning Problems.
    06:04
  • 3. A Complete Walkthrough.
    09:54
  • 4. App Setup.
    02:01
  • 5. Problem Outline.
    02:53
  • 6. Identifying Relevant Data.
    04:11
  • 7. Dataset Structures.
    05:47
  • 8. Recording Observation Data.
    04:00
  • 9. What Type of Problem.
    04:36
  • 1. Introducing Logistic Regression.
    02:28
  • 2. Logistic Regression in Action.
    06:31
  • 3. Bad Equation Fits.
    05:31
  • 4. The Sigmoid Equation.
    04:31
  • 5. Decision Boundaries.
    07:47
  • 6. Changes for Logistic Regression.
    01:12
  • 7. Project Setup for Logistic Regression.
    05:51
  • 9. Importing Vehicle Data.
    04:27
  • 10. Encoding Label Values.
    04:18
  • 11. Updating Linear Regression for Logistic Regression.
    07:08
  • 12. The Sigmoid Equation with Logistic Regression.
    00:00
  • 13. A Touch More Refactoring.
    07:46
  • 14. Gauging Classification Accuracy.
    03:27
  • 15. Implementing a Test Function.
    05:16
  • 16. Variable Decision Boundaries.
    07:16
  • 17. Mean Squared Error vs Cross Entropy.
    05:45
  • 18. Refactoring with Cross Entropy.
    05:08
  • 19. Finishing the Cost Refactor.
    04:36
  • 20. Plotting Changing Cost History.
    03:24
  • 1. Multinomial Logistic Regression.
    02:19
  • 2. A Smart Refactor to Multinomial Analysis.
    05:07
  • 3. A Smarter Refactor!
    00:00
  • 4. A Single Instance Approach.
    09:50
  • 5. Refactoring to Multi-Column Weights.
    04:40
  • 6. A Problem to Test Multinomial Classification.
    04:37
  • 7. Classifying Continuous Values.
    04:41
  • 8. Training a Multinomial Model.
    06:19
  • 9. Marginal vs Conditional Probability.
    09:56
  • 10. Sigmoid vs Softmax.
    06:08
  • 11. Refactoring Sigmoid to Softmax.
    00:00
  • 12. Implementing Accuracy Gauges.
    02:36
  • 13. Calculating Accuracy.
    03:15
  • 1. Handwriting Recognition.
    02:10
  • 2. Greyscale Values.
    05:11
  • 3. Many Features.
    03:29
  • 4. Flattening Image Data.
    06:06
  • 5. Encoding Label Values.
    05:44
  • 6. Implementing an Accuracy Gauge.
    07:26
  • 7. Unchanging Accuracy.
    01:55
  • 8. Debugging the Calculation Process.
    08:13
  • 9. Dealing with Zero Variances.
    06:15
  • 10. Backfilling Variance.
    02:36
  • 1. Handling Large Datasets.
    04:14
  • 2. Minimizing Memory Usage.
    00:00
  • 3. Creating Memory Snapshots.
    05:14
  • 4. The JavaScript Garbage Collector.
    00:00
  • 5. Shallow vs Retained Memory Usage.
    00:00
  • 6. Measuring Memory Usage.
    08:29
  • 7. Releasing References.
    03:14
  • 8. Measuring Footprint Reduction.
    00:00
  • 9. Optimizing TensorFlow Memory Usage.
    01:31
  • 10. TensorFlow’s Eager Memory Usage.
    04:40
  • 11. Cleaning up Tensors with Tidy.
    02:48
  • 12. Implementing TF Tidy.
    03:31
  • 13. Tidying the Training Loop.
    03:58
  • 14. Measuring Reduced Memory Usage.
    00:00
  • 15. One More Optimization.
    02:35
  • 16. Final Memory Report.
    02:45
  • 17. Plotting Cost History.
    00:00
  • 18. NaN in Cost History.
    00:00
  • 19. Fixing Cost History.
    04:45
  • 20. Massaging Learning Parameters.
    00:00
  • 21. Improving Model Accuracy.
    00:00
  • 1. Loading CSV Files.
    00:00
  • 2. A Test Dataset.
    02:00
  • 3. Reading Files from Disk.
    03:08
  • 4. Splitting into Columns.
    02:54
  • 5. Dropping Trailing Columns.
    02:30
  • 6. Parsing Number Values.
    00:00
  • 7. Custom Value Parsing.
    04:20
  • 8. Extracting Data Columns.
    00:00
  • 9. Shuffling Data via Seed Phrase.
    00:00
  • 10. Splitting Test and Training.
    07:44
  • 1. How K-Nearest Neighbor Works.
    00:00
  • 2. Lodash Review.
    09:56
  • 3. Implementing KNN.
    07:16
  • 4. Finishing KNN Implementation.
    00:00
  • 5. Testing the Algorithm.
    04:47
  • 6. Interpreting Bad Results.
    04:12
  • 7. Test and Training Data.
    04:05
  • 8. Randomizing Test Data.
    03:48
  • 9. Generalizing KNN.
    03:41
  • 10. Gauging Accuracy.
    00:00
  • 11. Printing a Report.
    03:30
  • 12. Refactoring Accuracy Reporting.
    00:00
  • 13. Investigating Optimal K Values.
    11:38
  • 14. Updating KNN for Multiple Features.
    00:00
  • 15. Multi-Dimensional KNN.
    03:56
  • 16. N-Dimension Distance.
    09:50
  • 17. Arbitrary Feature Spaces.
    08:27
  • 18. Magnitude Offsets in Features.
    05:36
  • 19. Feature Normalization.
    07:32
  • 20. Normalization with MinMax.
    07:14
  • 21. Applying Normalization.
    04:22
  • 22. Feature Selection with KNN.
    07:47
  • 23. Objective Feature Picking.
    06:10
  • 24. Evaluating Different Feature Values.
    02:53
  • 1. Let’s Get Our Bearings.
    07:27
  • 2. A Plan to Move Forward.
    04:31
  • 3. Tensor Shape and Dimension.
    10:02
  • 5. Elementwise Operations.
    08:19
  • 6. Broadcasting Operations.
    06:47
  • 8. Logging Tensor Data.
    03:47
  • 9. Tensor Accessors.
    05:24
  • 10. Creating Slices of Data.
    07:46
  • 11. Tensor Concatenation.
    05:28
  • 12. Summing Values Along an Axis.
    05:13
  • 13. Massaging Dimensions with ExpandDims.
    07:47
  • 1. KNN with Regression.
    04:56
  • 2. A Change in Data Structure.
    04:04
  • 3. KNN with TensorFlow.
    09:18
  • 4. Maintaining Order Relationships.
    06:30
  • 5. Sorting Tensors.
    08:00
  • 6. Averaging Top Values.
    07:44
  • 7. Moving to the Editor.
    03:26
  • 8. Loading CSV Data.
    10:10
  • 9. Running an Analysis.
    06:10
  • 10. Reporting Error Percentages.
    06:26
  • 11. Normalization or Standardization.
    07:33
  • 12. Numerical Standardization with TensorFlow.
    07:37
  • 13. Applying Standardization.
    04:01
  • 14. Debugging Calculations.
    08:14
  • 15. What Now.
    00:00
  • 1. Linear Regression.
    02:40
  • 2. Why Linear Regression.
    04:52
  • 3. Understanding Gradient Descent.
    13:04
  • 4. Guessing Coefficients with MSE.
    10:20
  • 5. Observations Around MSE.
    05:57
  • 6. Derivatives!
    07:12
  • 7. Gradient Descent in Action.
    11:46
  • 8. Quick Breather and Review.
    05:46
  • 9. Why a Learning Rate.
    17:05
  • 10. Answering Common Questions.
    03:48
  • 11. Gradient Descent with Multiple Terms.
    04:43
  • 12. Multiple Terms in Action.
    10:39
  • 1. Project Overview.
    06:01
  • 2. Data Loading.
    00:00
  • 3. Default Algorithm Options.
    08:32
  • 4. Formulating the Training Loop.
    03:18
  • 5. Initial Gradient Descent Implementation.
    09:24
  • 6. Calculating MSE Slopes.
    06:52
  • 7. Updating Coefficients.
    03:12
  • 8. Interpreting Results.
    10:07
  • 9. Matrix Multiplication.
    00:00
  • 10. More on Matrix Multiplication.
    06:40
  • 11. Matrix Form of Slope Equations.
    06:22
  • 12. Simplification with Matrix Multiplication.
    09:28
  • 13. How it All Works Together!
    14:02
  • 1. Refactoring the Linear Regression Class.
    07:40
  • 2. Refactoring to One Equation.
    08:58
  • 3. A Few More Changes.
    06:13
  • 4. Same Results Or Not.
    03:19
  • 5. Calculating Model Accuracy.
    08:37
  • 6. Implementing Coefficient of Determination.
    07:44
  • 7. Dealing with Bad Accuracy.
    07:48
  • 8. Reminder on Standardization.
    04:36
  • 9. Data Processing in a Helper Method.
    03:38
  • 10. Reapplying Standardization.
    05:57
  • 11. Fixing Standardization Issues.
    05:36
  • 12. Massaging Learning Rates.
    03:15
  • 13. Moving Towards Multivariate Regression.
    11:44
  • 14. Refactoring for Multivariate Analysis.
    07:28
  • 15. Learning Rate Optimization.
    08:04
  • 16. Recording MSE History.
    05:21
  • 17. Updating Learning Rate.
    06:41
  • 1. Observing Changing Learning Rate and MSE.
    04:18
  • 2. Refactoring Towards Batch Gradient Descent.
    05:21
  • 3. Plotting MSE History against B Values.
    04:22
  • 1. Batch and Stochastic Gradient Descent.
    07:17
  • 2. Refactoring Towards Batch Gradient Descent.
    05:06
  • 3. Determining Batch Size and Quantity.
    06:02
  • 4. Iterating Over Batches.
    07:48
  • 5. Evaluating Batch Gradient Descent Results.
    05:41
  • 6. Making Predictions with the Model.
    07:37
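The curriculum above is built around implementing these algorithms by hand in JavaScript. As a flavor of the material, here is a minimal sketch of the sigmoid function and a single gradient descent update for one-feature logistic regression; the function names and data shapes are our own illustration, not the course's actual code.

```javascript
// Sigmoid squashes any real number into the (0, 1) range,
// turning a linear combination into a probability-like output.
function sigmoid(z) {
  return 1 / (1 + Math.exp(-z));
}

// One gradient descent step for logistic regression with a single
// feature. weights = [intercept, slope]; data = [{ x, label }, ...]
// where label is 0 or 1. Uses the cross-entropy cost gradient.
function gradientDescentStep(weights, data, learningRate) {
  const n = data.length;
  let gradB0 = 0;
  let gradB1 = 0;
  for (const { x, label } of data) {
    const guess = sigmoid(weights[0] + weights[1] * x);
    // Partial derivatives of cross-entropy cost w.r.t. each weight.
    gradB0 += (guess - label) / n;
    gradB1 += ((guess - label) * x) / n;
  }
  // Move each weight against its gradient, scaled by the learning rate.
  return [
    weights[0] - learningRate * gradB0,
    weights[1] - learningRate * gradB1,
  ];
}
```

Repeating this update in a loop, then thresholding `sigmoid(...)` at a decision boundary such as 0.5, yields a basic classifier of the kind the course builds up and later refactors onto TensorFlow.js tensors.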
Free access to this course.

Course info:

Categories: Free Courses

Target Audience

  • This course is aimed at anyone who wants to learn.