2. Why? Because an integer is any whole number, and as we learned in school a whole number is a number with no decimal or fractional part. So in programming an integer has the least precision of the basic numeric types, since it stores no decimals.
Precision of the basic data types, from lowest to highest:
int < float < double
So when we call int(2.5) the value is cast to an int and the precision (the decimal part) is chopped off.
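A quick sketch of this in Python (the `int(...)` syntax in the question looks like Python). Note the cast truncates toward zero rather than rounding:

```python
import math

# int() drops the fractional part (truncates toward zero)
print(int(2.5))   # 2
print(int(-2.5))  # -2, not -3: truncation, not flooring

# math.floor() rounds toward negative infinity instead
print(math.floor(-2.5))  # -3
```

So int(2.5) and int(2.9) both give 2; nothing is rounded, the decimals are simply discarded.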