+ 1

What is the difference between const and #define?

I don't understand the difference between defining a constant with const and with #define, for example const int num = 2; versus #define num 2. What is the difference between them?

4th Sep 2018, 6:41 PM
estifanos
2 Answers
+ 3
Defines are handled by the preprocessor. The preprocessor doesn't know the language: it simply replaces every occurrence of the word 'num' in the code, whether that word is a function name, a class name, a local variable name, or something else. Such blind replacement is dangerous and can produce compilation errors. For example, windows.h defines min and max macros that break calls to std::min() and std::max(). Constants, on the other hand, are handled by the compiler and are part of the program: they have a type, an address, and normal visibility rules. Using constants is much safer than using defines.
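A minimal sketch of that hazard (the names num, count, and print are purely illustrative, not from the question):

```cpp
#include <iostream>

#define num 2   // the preprocessor blindly replaces every later occurrence of the token 'num' with 2

// The compiler never sees the identifier 'num' below; after preprocessing the
// parameter list would read "(int 2)", which does not compile, so the line is
// left commented out:
// void print(int num) { std::cout << num << '\n'; }

const int count = 2;   // a typed constant; its name obeys normal C++ scoping rules

void print(int value) { std::cout << value << '\n'; }

int main() {
    print(num);    // expands to print(2) before compilation
    print(count);  // passes the value stored in the const object
    return 0;
}
```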
4th Sep 2018, 7:54 PM
Sergey Ushakov
+ 3
#define is handled by the preprocessor, not the compiler. Before compiling, every occurrence of num is replaced by 2, and the resulting source is then compiled. const is part of the language itself, and a const num ends up as one place in memory where the value of num is stored.
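A short sketch of that difference (the names SIZE_MACRO and size_const are just illustrative): after preprocessing, every use of the macro is the literal 2, while the const object occupies one place in memory and even has an address.

```cpp
#include <iostream>

#define SIZE_MACRO 2       // plain text substitution: no type, no storage
const int size_const = 2;  // a real object: it has a type (int) and one place in memory

int main() {
    std::cout << SIZE_MACRO << '\n';   // compiled as: std::cout << 2 << '\n';
    std::cout << size_const << '\n';   // reads the value stored in the object
    std::cout << &size_const << '\n';  // the const object has an address
    // std::cout << &SIZE_MACRO;       // would not compile: expands to &2
    return 0;
}
```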
4th Sep 2018, 7:06 PM
ifl