1. What is the primary role of an 'Activation Function' like ReLU?
2. What is the 'Backward Pass' (Backpropagation)?
3. How does a 'Convolutional Layer' differ from a 'Dense Layer'?
4. What is the purpose of 'Batch Normalization'?
5. Which architecture is known for having a 'memory' of previous inputs, ideal for text or other sequences?
6. In the anatomy of a neuron, what is the 'Bias'?
7. What is 'Dropout' used for in neural networks?
8. What happens in a 'Pooling Layer'?
9. In a GAN, what is the 'Generator' trying to do?
10. Which activation function is most likely to be used for the output of a multi-class classifier (e.g., 10 types of animals)?
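Two of the activation functions referenced above (ReLU in question 1, softmax in question 10) can be sketched in a few lines of plain Python. This is a minimal illustration of the math, not a production implementation; in practice a framework such as PyTorch or TensorFlow provides these:

```python
import math

def relu(x):
    # ReLU keeps positive values and zeroes out negatives,
    # introducing the non-linearity asked about in question 1.
    return max(0.0, x)

def softmax(logits):
    # Softmax turns raw class scores into probabilities that sum to 1,
    # the usual output activation for a multi-class classifier (question 10).
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(v - m) for v in logits]
    total = sum(exps)
    return [e / total for e in exps]

print(relu(-2.5))            # -> 0.0
print(relu(3.0))             # -> 3.0
probs = softmax([2.0, 1.0, 0.1])
print(round(sum(probs), 6))  # -> 1.0
```

Note that softmax produces a probability distribution over all classes (e.g., the 10 animal types), whereas ReLU is typically used in hidden layers rather than at the output.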