What is the bias-variance tradeoff?
Bias: error from wrong assumptions (underfitting, model too simple). Variance: error from sensitivity to the particular training set (overfitting, model too complex). Total expected error decomposes as bias² + variance + irreducible noise. You want low bias AND low variance, but reducing one typically increases the other; the sweet spot is the model complexity that minimizes total error. Regularization (L1, L2) helps control variance at the cost of a small increase in bias.
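A minimal sketch of the tradeoff, using polynomial regression on noisy data (the degrees, sample sizes, and noise level are illustrative choices, not canonical values): a low-degree fit underfits (high bias, high error everywhere), while a high-degree fit drives training error down but test error up (high variance).

```python
import numpy as np

rng = np.random.default_rng(0)

def make_data(n):
    # Noisy samples from a smooth target function.
    x = rng.uniform(0, 1, n)
    y = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, n)
    return x, y

x_train, y_train = make_data(30)
x_test, y_test = make_data(200)

def fit_and_score(degree):
    # Fit a polynomial of the given degree by least squares,
    # then report mean squared error on train and test sets.
    coef = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coef, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coef, x_test) - y_test) ** 2)
    return train_mse, test_mse

for degree in (1, 3, 10):
    train_mse, test_mse = fit_and_score(degree)
    print(f"degree {degree:2d}: train MSE {train_mse:.3f}, test MSE {test_mse:.3f}")
```

Training error can only fall as degree grows (a richer model always fits the training set at least as well), but test error follows the U-shape the card describes: it drops as bias falls, then rises again as variance takes over.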