
+2

char to int

Can anyone share their knowledge on this? I'm declaring decimal numbers using the "char" data type, and the output must stay the same: instead of characters, it should still return decimal values. Ex: Input: 1234 (but the data type used is char) Output: 1234

c

10/3/2017 1:14:53 PM

Clark Justine L. Gonzales

21 Answers


+5

Is the char type capable of accepting 1234 as a value? Isn't it limited to 255? It's only one byte (8 bits) of data. Are you sure, or am I misunderstanding the question?

+4

@Kirk Schafer, thanks, that's an interesting point. Certainly a char combination will accept the value, because together the chars form 16-bit data, which can hold the value 1234. I didn't know the formula for combining byte values as you explained it; that was great! Another interesting point: to test the fact you pointed out, I wrote a text file containing 'aa' and viewed the content in a hex editor. I see 61 61, which, combined as a 16-bit value, makes up 0x6161, the hexadecimal representation of 24,929, just like your example. I really appreciate your enlightenment, thank you once again :)

+4

@~ swim ~, that's exactly true. However, the question did not tag Java; it tagged C instead, and in C/C++ a char is one byte. I was questioning how a single byte (char) can hold a value as large as 1234. In Java a char is defined as two bytes because Java is Unicode compliant: a Java char is a 16-bit UTF-16 code unit, so non-Latin characters can be handled correctly. Thank you for the explanation, mate, there's always something new to learn every day. I really appreciate it :)

+4

@~ swim ~, I'll keep that wchar_t in mind, thanks for the advice mate ;-)

+4

@~ swim ~, Thanks for more info on the Unicode string usage, actually I haven't got that far, but I'm gonna be keeping those in my library for later use. Big thanks mate :)

+4

@~ swim ~, Now that's overwhelming, okay, lemme make a note on that... Thanks for #brainstorm :-D

+3

@lpang Multibyte character literals would get this done, but not a single char:

cout << int('a') << endl;      // 97
cout << 'aa' << endl;          // 24929
cout << 97 * 256 + 97 << endl; // 24929

You can infer from this that the limit is 256 values per byte, since the next byte up is multiplied by 256 to record it properly.
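The same idea as a minimal sketch in plain C (multi-character literals like 'aa' have implementation-defined values in C, so the arithmetic form is shown alongside):

#include <stdio.h>

int main(void)
{
    printf("%d\n", 'a');             /* 97 */
    printf("%d\n", 'a' * 256 + 'a'); /* 24929: two bytes combined in base 256 */
    return 0;
}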

+3

(quietly) Multibyte character sets (MBCS, UTF-8, -16, -32, ...) can be longer than 2 bytes per represented char. gcc on SoloLearn (a 64-bit server, I think) is configured in 32-bit mode:

cout << 'abcd'; // is valid here, 32 bits

Each 'slot' is a power of 256, in the same fashion as all bases:

a*256³ + b*256² + c*256¹ + d (*256⁰, or 1)

[sharp turn ahead] You can use the same formula for IP addresses:

127.0.0.1 = 127*256³ + 0 + 0 + 1 = 2130706433

$ ping 2130706433
PING 2130706433 (127.0.0.1) 56(84) bytes of data.
64 bytes from 127.0.0.1: icmp_seq=1 ttl=64 time=0.194 ms
64 bytes from 127.0.0.1: icmp_seq=2 ttl=64 time=0.227 ms
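The same base-256 arithmetic written out as a small C sketch (illustrative only):

#include <stdio.h>

int main(void)
{
    /* 127.0.0.1 expressed as a single base-256 number */
    unsigned long ip = 127UL * 256 * 256 * 256
                     +   0UL * 256 * 256
                     +   0UL * 256
                     +   1UL;
    printf("%lu\n", ip); /* 2130706433 */
    return 0;
}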

+2

Oh damn, I meant 0x30. You are right, I'll edit it! That's what you get for not double checking :P

+2

@Clark, char to int is an implicit conversion (also called widening); there is no need to do anything special. The compiler will give you a warning if you are doing a signed/unsigned comparison. int to char is called a narrowing conversion, and this is what causes various issues: loss of data and loss of precision (e.g. double to float).
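A minimal sketch of both directions (the narrowed result assumes a typical platform where char is signed):

#include <stdio.h>

int main(void)
{
    char c = 'A';
    int widened = c;           /* implicit widening: no cast needed */

    int big = 1234;
    char narrowed = (char)big; /* narrowing: only the low byte (0xD2) survives */

    printf("%d %d\n", widened, narrowed); /* 65 -46 */
    return 0;
}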

+2

One last thing :) Usually the best way to deal with the ASCII and wchar_t variants of library functions is to use the <tchar.h> variants of the library functions. You will be required to wrap your strings in a macro:

_TCHAR const awstr[] = _T("my string");
_tprintf(_T("%s"), awstr);

Now if Unicode is enabled, the following translations are performed during the preprocessing stage:

_T("my string") ==> L"my string", else ==> "my string"
_TCHAR ==> wchar_t, else ==> char
_tprintf ==> wprintf, else ==> printf

Thanks :)
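A Windows-only sketch of the above, assuming MSVC's <tchar.h> (compile with /D_UNICODE for the wide build):

#include <tchar.h>
#include <stdio.h>

int main(void)
{
    _TCHAR const awstr[] = _T("my string");
    _tprintf(_T("%s\n"), awstr); /* expands to wprintf or printf */
    return 0;
}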

+2

Thank you all for sharing your ideas!

+1

You can convert a single digit like this:

char one_char = '1';
int one_int = one_char - 0x30;

However, it is better to use C's atoi function, which converts char*'s to ints:

#include <stdlib.h>

char* string = "1234";
int number = atoi(string);
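One caveat: atoi gives no way to detect bad input (it just returns 0). A sketch using strtol instead, if you need error checking:

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    const char *string = "1234";
    char *end;
    long number = strtol(string, &end, 10); /* base 10; end points past the digits */
    if (end == string)
        printf("not a number\n");
    else
        printf("%ld\n", number); /* 1234 */
    return 0;
}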

+1

@Shindlabua, isn't it rather '1' - 48, as 48 == '0'?

+1

How about converting char to int?

+1

char c = '5', *s = "1234";

Getting the ASCII value of a char: (int)c;
Getting the value of the digit: (int)(c - '0');
Translating a string into an integer: atoi(s);
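Putting those together for the original question, a minimal sketch:

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    char c = '5';
    const char *s = "1234";

    printf("%d\n", (int)c);  /* 53: the ASCII code of '5' */
    printf("%d\n", c - '0'); /* 5: the digit's numeric value */
    printf("%d\n", atoi(s)); /* 1234: the whole string as an int */
    return 0;
}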

+1

@lpang, the size of a char is programming-language dependent. In most programming languages a char is 1 byte (in C, sizeof(char) is 1 by definition), but in Java a char is always 2 bytes. And I think programming languages that compile to byte code (virtual machines) tend to have 2-byte chars, but it can be different. It is the responsibility of the virtual machine to manage the sizes of the various data types when the byte code is ultimately translated to native code for the target platform.
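For C specifically, this is easy to check with a tiny sketch (the wchar_t size varies by platform):

#include <stdio.h>
#include <wchar.h>

int main(void)
{
    /* sizeof(char) is 1 by definition in C; sizeof(wchar_t) is platform-dependent */
    printf("%zu %zu\n", sizeof(char), sizeof(wchar_t));
    return 0;
}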

+1

@lpang, yeah, my mistake ;-) but I am glad you learned something new. In C you can use wchar_t, which is two bytes on Windows (and often four on Linux).

+1

@lpang, sorry for the incomplete information about wchar_t. If you use wchar_t, then you have to use the wide-character variants of the C library functions, i.e. instead of strlen use wcslen, and instead of printf use wprintf.
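A minimal wide-character sketch showing both substitutions:

#include <wchar.h>

int main(void)
{
    wchar_t ws[] = L"1234";
    wprintf(L"%ls is %zu characters long\n", ws, wcslen(ws)); /* wcslen instead of strlen */
    return 0;
}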

+1

LOL :) Sorry for the overdose!