
What are the properties of neural networks at initialization? Are there better ways to initialize them for a given activation function? What happens when the number of neurons per layer grows to infinity? A small numerical experiment illustrating the first two questions is sketched below.
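The following is a minimal sketch (not taken from the course materials, all function names, depths, and widths are illustrative assumptions) of why the initialization scale interacts with the activation function: it propagates a random input through a deep ReLU network at initialization and records how the size of the activations evolves under two common weight variances.

import numpy as np

rng = np.random.default_rng(0)

def activation_std_at_init(weight_scale, depth=50, width=512):
    # Std of the activations after each layer of a ReLU MLP whose weights are
    # drawn i.i.d. from N(0, weight_scale / width) at initialization.
    x = rng.standard_normal(width)
    stds = []
    for _ in range(depth):
        W = rng.standard_normal((width, width)) * np.sqrt(weight_scale / width)
        x = np.maximum(W @ x, 0.0)  # ReLU activation
        stds.append(x.std())
    return stds

# Variance 1/width: with ReLU the activations shrink layer by layer.
# Variance 2/width: the activation magnitude stays roughly constant across depth,
# which is the classic argument for He initialization with ReLU networks.
print("scale 1/width:", activation_std_at_init(1.0)[-1])
print("scale 2/width:", activation_std_at_init(2.0)[-1])

Running this shows the first scaling driving the signal toward zero with depth while the second keeps it at a stable order of magnitude, a concrete instance of how the "right" initialization depends on the chosen activation function.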