
Machine Learning - Supervised Learning - Bias Variance Trade Off Tutorial

Bias – The inability of a machine learning model to truly capture the relationship between the features and the target in the training data. If the difference between the predicted values and the actual values on the training data is large, the model is highly biased; if it is small, the model has low bias.

https://miro.medium.com/max/700/1*ABijNe3CESw-o5k4YcahHg.png
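As a rough illustration (not part of the original tutorial), the Python sketch below fits a straight line to data whose true relationship is quadratic; the training error stays large, which is what high bias looks like. The synthetic data and the model choice are assumptions made for this example only.

# High-bias sketch: a straight line fitted to quadratic data (illustrative assumptions)
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = np.linspace(-3, 3, 100).reshape(-1, 1)
y = X.ravel() ** 2 + rng.normal(0, 0.5, size=100)    # true relationship is quadratic

line = LinearRegression().fit(X, y)                  # model too simple to capture the curve
train_mse = mean_squared_error(y, line.predict(X))
print(f"Training MSE of the straight-line model: {train_mse:.2f}")   # stays large -> high bias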

Variance – How much the model's predictions change when it is trained on different training data. A model with high variance fits the training data too closely (overfits) and is therefore not able to fit the test data accurately. As a result, such models perform very well on training data but have high error rates on test data.

 

https://miro.medium.com/max/700/1*J1NNmV8kaPzeRY5b3YXmvw.png

https://miro.medium.com/max/700/1*4w8a8nztaF_v4k_D-IvHAw.png

 

https://towardsdatascience.com/the-relationship-between-bias-variance-overfitting-generalisation-in-machine-learning-models-fb78614a3f1e 

 

Overfit - Overfitting occurs when our machine learning model tries to cover all the data points of the training dataset, including its noise, rather than just the underlying pattern. The overfitted model has low bias and high variance.

 

Underfit - Underfitting occurs when our machine learning model is not able to capture the underlying trend of the data. The underfitted model has high bias and low variance.
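To make the two failure modes concrete, here is a minimal Python sketch (with assumed synthetic sine data and assumed polynomial degrees 1 and 15) comparing an underfitted and an overfitted regression model: the degree-1 model typically has high error on both sets (high bias), while the degree-15 model typically has very low training error but a much higher test error (high variance).

# Underfit vs overfit sketch with polynomial regression (illustrative assumptions)
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(42)
X = rng.uniform(0, 1, 80).reshape(-1, 1)
y = np.sin(2 * np.pi * X.ravel()) + rng.normal(0, 0.2, size=80)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

for degree in (1, 15):                         # degree 1 underfits, degree 15 tends to overfit
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    model.fit(X_train, y_train)
    train_mse = mean_squared_error(y_train, model.predict(X_train))
    test_mse = mean_squared_error(y_test, model.predict(X_test))
    print(f"degree={degree:2d}  train MSE={train_mse:.3f}  test MSE={test_mse:.3f}")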

 

 

Bias-Variance trade-off

The Bias-Variance trade-off is about finding the sweet spot that balances the bias error and the variance error.

 

While building a machine learning model, it is really important to take care of bias and variance in order to avoid overfitting and underfitting. If the model is very simple with few parameters, it may have low variance and high bias, whereas if the model has a large number of parameters, it will have high variance and low bias. So a balance between the bias and variance errors is required (low bias and low variance), and this balance between the bias error and the variance error is known as the Bias-Variance trade-off.

https://www.javatpoint.com/bias-and-variance-in-machine-learning 
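One common way to locate this sweet spot in practice (a sketch on assumed synthetic data, not the tutorial's own method) is to sweep model complexity and keep the setting with the lowest cross-validated error, as in the Python snippet below.

# Complexity sweep sketch: pick the polynomial degree with the lowest validation error
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.uniform(0, 1, 120).reshape(-1, 1)
y = np.sin(2 * np.pi * X.ravel()) + rng.normal(0, 0.2, size=120)

scores = {}
for degree in range(1, 13):
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    # cross_val_score returns negative MSE; flip the sign so lower is better
    scores[degree] = -cross_val_score(model, X, y, cv=5,
                                      scoring="neg_mean_squared_error").mean()

best = min(scores, key=scores.get)
print(f"Degree with the lowest validation MSE: {best} (MSE={scores[best]:.3f})")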



 

Total Error

To build a good model, we need to find a good balance between bias and variance such that it minimizes the total error.

https://miro.medium.com/max/882/1*SKHGhoGKnBh_GPGHI2Ktvw.png

Bias and Variance in Machine Learning
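For reference, the total error discussed here is commonly written as the standard decomposition of the expected prediction error:

Total Error = Bias² + Variance + Irreducible Error

where the irreducible error is the noise in the data itself, which no model can remove. Lowering model complexity reduces variance but raises bias, and raising complexity does the opposite, so the total error is minimized somewhere in between.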

 

To find the bias-variance trade-off, or to reduce overfitting, three methods are commonly used, i.e. regularization, bagging, and boosting.
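As a quick illustration of the first of these remedies, regularization, the Python sketch below compares a flexible degree-15 polynomial fit with and without a Ridge penalty on assumed synthetic data; the penalty shrinks the coefficients, which typically lowers the test error of the overfitted model. The data, degree, and alpha value are assumptions made for the example.

# Regularization sketch: Ridge penalty applied to an overfitting polynomial model
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(7)
X = rng.uniform(0, 1, 80).reshape(-1, 1)
y = np.sin(2 * np.pi * X.ravel()) + rng.normal(0, 0.2, size=80)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

for name, reg in [("no regularization", LinearRegression()),
                  ("ridge (alpha=0.001)", Ridge(alpha=0.001))]:
    model = make_pipeline(PolynomialFeatures(degree=15), reg)
    model.fit(X_train, y_train)
    test_mse = mean_squared_error(y_test, model.predict(X_test))
    print(f"{name:22s} test MSE = {test_mse:.3f}")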

 

