Does punctuation in programming even make sense? | Sololearn
0

Does punctuation in programming even make sense?

Because to me it doesn't at all. I'm not saying it's bad, I just don't get it. It's just slapping different symbols around words without any kind of visible logic. Why do you need to end certain types of statements with ";"? Why is the remainder of a division marked with "%" and not ":" or "÷"? Capitalization here seems plainly illogical and random as well (at least to me). Who in their right mind would think that jammingAllWordsInAPhraseTogetherLikeThis, with the first letter small and seemingly random letters capitalized, is convenient? When I question it, the only answer I get is "so the computer understands it". Oh wow. When the computer understands it, I don't understand it. What next? This isn't a hissy fit, I am genuinely seeking help.

15th Mar 2022, 8:11 PM
Unexpected Explosions
8 Answers
+ 2
Assembly was better than machine code, but still quite abstract. We aimed for something easier to read, which became programming languages like C. It is ordered, has syntax rules, requires the programmer to follow those rules, and gives predictable results. Languages kept developing to be even easier for humans to read, like BASIC and Java, and further improvements gave us things like Python and SQL. The long and short of it is: practice and a "syntax lookup cheat sheet" help you give the computer what it needs so it does what you want. 🏠🔨
15th Mar 2022, 10:24 PM
HungryTradie
+ 5
You can name variables whatever you want as long as the name doesn't start with a number or a special character. A name like isCamelCase follows a naming convention called camel case. It is not random: the first letter is lower case, and the first letter of each following word is capital. % is the modulo operator in many programming languages; it is a convention. If it were a different character, would you hold forth about why it is not %?
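For illustration, here is a minimal sketch in Java (the same syntax applies in C#); the variable names are made up for the example:

```java
public class NamingAndModulo {
    public static void main(String[] args) {
        // camelCase: the first word starts lower case, every following
        // word starts with a capital letter. The pattern is fixed, not random.
        int totalItemCount = 17;
        int itemsPerBox = 5;

        // % is the modulo (remainder) operator: 17 divided by 5 is 3, remainder 2.
        int fullBoxes = totalItemCount / itemsPerBox; // 3
        int leftOver = totalItemCount % itemsPerBox;  // 2

        System.out.println(fullBoxes + " full boxes, " + leftOver + " items left over");
    }
}
```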
15th Mar 2022, 8:21 PM
Lisa
+ 4
Might I suggest that you start with something other than C#? Lots of noob programmers start with Python, or with the trio of HTML, CSS, and JavaScript. If I had any influence over a clever noob, I would most strongly recommend learning C, because if you can do it there, you can easily do it with any other language!
15th Mar 2022, 10:29 PM
HungryTradie
+ 3
One major source of difficulty with symbols is that they have to be completely unambiguous. Since the computer doesn't understand meaning, it can't tell the difference between using a dot in a decimal number and using it to end a sentence. So you have to make a decision: do you want to use the dot for decimal numbers or to end statements? It's easier to use a different symbol to end statements, so many languages use the semicolon for that purpose instead. On the other hand, it would be hard to find a good alternative for the colon, so you can't use ":" for division either.

Different languages make different design choices, but there will always have to be compromises (at least until AI advances to the point where computers can "understand" meaning). On the positive side, at least most languages use the same symbols for the basic arithmetic operators, so you don't have to learn new ones every time you switch languages.
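As a small illustration of that ambiguity, here is a Java sketch (with made-up variable names) showing the jobs the dot already has, and why the semicolon is free to mark the end of a statement:

```java
public class Terminators {
    public static void main(String[] args) {
        // The dot already has other jobs, so the compiler could not reliably
        // tell "end of statement" apart from these uses:
        double price = 3.75;                  // dot as a decimal point
        String shout = "hello".toUpperCase(); // dot as member access

        // The semicolon has no other meaning inside an expression,
        // so it can unambiguously mark where each statement ends.
        System.out.println(shout + " costs " + price);
    }
}
```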
16th Mar 2022, 12:29 AM
Simon Sauter
+ 2
Perhaps the thing you are missing on your journey to understanding is history. Computers started as very simple machines: an abacus could be called a computer. We moved on to mechanical machines, with gears and cams that could change the frequency of an increment, e.g. one turn of that gear is 5 turns of this gear.

Then we got electricity and, with it, the ability to switch circuits on and off. It became more like Boolean logic: this on + that on = the other thing on; first on or second on = third off. We moved to electronics, doing the on/off switching with transistors and then silicon chips, and the logic became much more complex.

We used things like punch cards to store logic sequences (I'm not sure when they started being called programs). The logic became larger than punch cards could easily handle, so we came up with magnetic storage (like a floppy disc). We wrote the logic in machine code, but the 0s and 1s are very unforgiving to human eyes. Assembly language, with hexadecimal (base 16) as a shorthand, replaced the raw binary 1s and 0s.
15th Mar 2022, 10:19 PM
HungryTradie
+ 2
Alright, thank you all for your replies, things are clearer now.
15th Mar 2022, 11:52 PM
Unexpected Explosions
+ 2
Lisa, if it were different from the symbols used in math, then I guess I would, because I'm not used to it. Anyway, thank you.
15th Mar 2022, 11:55 PM
Unexpected Explosions
0
If you want to write in a language that you understand but the computer doesn't, that's fine. It's just not programming. You don't have to program.
15th Mar 2022, 8:22 PM
Simon Sauter