I forgot what logistic regression is 😅, but it might be related to backpropagation, which is the process by which a neural network learns.
The activation function I’ve been talking about is the sigmoid function, which is the most common activation function. There are other activation functions for specific purposes, but I don’t know of any specific ones.
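To make the sigmoid concrete, here is a tiny sketch in Python (the formula is the standard one, 1 / (1 + e^(-x)); the function name is just my choice):

```python
import math

def sigmoid(x):
    # Squashes any real number into the range (0, 1).
    # Large positive x -> close to 1, large negative x -> close to 0.
    return 1.0 / (1.0 + math.exp(-x))

print(sigmoid(0))    # exactly 0.5, since e^0 = 1
print(sigmoid(5))    # close to 1
print(sigmoid(-5))   # close to 0
```

That squashing into (0, 1) is what lets a neuron’s output be read as something like an on/off strength.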
For further understanding I recommend the book (technically a website) at www.neuralnetworksanddeeplearning.com. It introduces the reader to neural networks, assumes no prior knowledge of them, and needs no advanced math (that of course depends on your definition of advanced math; familiarity with matrices is recommended but not required). If you want, think of matrices as arrays, or sometimes arrays of arrays, aka 2D arrays, and think of the arithmetic operations on them, including the dot product, as functions you apply to these arrays that return either a number or another array.

If you have trouble with something, for example a mathematical symbol you don’t recognize, you should google it instead of throwing up your hands and saying you have no understanding of anything. As surprising as it may be, you actually aren’t expected to know anything and everything, and it’s totally normal to not understand something or to have questions. Just make sure to ask Google things like “what is this weird Z symbol in math” (the symbol I’m mentioning here is the capital sigma, the symbol for taking a sum of something, such as all the elements in a matrix). I recommend the website “Math is Fun” for help with all things math related, such as summation notation.
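To make the “matrices as arrays” and summation ideas concrete, here is a tiny sketch in plain Python (a library like NumPy would be the usual tool, but plain lists keep it dependency-free; the variable names are just my choices):

```python
# A matrix as a list of lists (a "2D array").
A = [[1, 2],
     [3, 4]]

# The capital-sigma summation: add up every element of the matrix.
total = sum(x for row in A for x in row)
print(total)  # 1 + 2 + 3 + 4 = 10

# The dot product of two vectors: multiply pairwise, then sum.
v = [1, 2, 3]
w = [4, 5, 6]
dot = sum(a * b for a, b in zip(v, w))
print(dot)  # 1*4 + 2*5 + 3*6 = 32
```

So the scary-looking sigma is really just a loop that keeps a running total, which is exactly what Python’s sum() does.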