Difference between Normalization and Standardization? Why we do Minmax scaling and standard scaling? | Sololearn: Learn to code for FREE!
+ 3

Difference between Normalization and Standardization? Why we do Minmax scaling and standard scaling?

Please explain what happens during scaling.

4th Sep 2022, 7:07 AM
Ashwin Kumar
1 Answer
+ 1
Normalization and standardization are preprocessing steps that rescale features so machine learning algorithms train more effectively. Normalization rescales data into a fixed range, typically 0 to 1, while standardization rescales data to have zero mean and unit variance. Min-max scaling is the usual technique for normalization, and standard scaling (z-score scaling) is the usual technique for standardization. Use min-max scaling when the bounded range of the variable matters (e.g. for algorithms sensitive to absolute magnitudes), and standard scaling when the mean and variance matter (e.g. when features should be centered, or the data is roughly Gaussian).
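To make the two formulas concrete, here is a minimal sketch in plain Python (the sample values are made up for illustration): min-max scaling computes (x - min) / (max - min), and standard scaling computes (x - mean) / std.

```python
# Illustrative example: min-max scaling vs. standard scaling on a tiny sample.
data = [10.0, 20.0, 30.0, 40.0, 50.0]

# Min-max scaling (normalization): maps values into the [0, 1] range.
lo, hi = min(data), max(data)
minmax_scaled = [(x - lo) / (hi - lo) for x in data]

# Standard scaling (standardization): result has zero mean and unit variance.
mean = sum(data) / len(data)
variance = sum((x - mean) ** 2 for x in data) / len(data)
std = variance ** 0.5
standard_scaled = [(x - mean) / std for x in data]

print(minmax_scaled)    # values span exactly 0.0 to 1.0
print(standard_scaled)  # values centered on 0, spread in units of std
```

In practice you would use `MinMaxScaler` and `StandardScaler` from scikit-learn, which also remember the fitted min/max or mean/std so the same transform can be applied to test data.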
4th Apr 2023, 4:49 AM
don't be sad 😶