The long-delayed Grokking Deep Learning book has finally been published and arrived on my desk. The book is organized around implementing neural networks from scratch with NumPy as you learn (a small sketch of that style follows the table of contents). At around 300 pages it isn't a thick book, and the illustrations stand out. It's a shame the originally planned reinforcement learning material was dropped. Perhaps because the author is an NLP researcher, convolutional neural networks get only brief coverage; instead, plenty of space is devoted to recurrent neural networks and LSTMs. The table of contents is below. 🙂
- Introducing deep learning: why you should learn it
- Fundamental concepts: how do machines learn?
- Introduction to neural prediction: forward propagation
- Introduction to neural learning: gradient descent
- Learning multiple weights at a time: generalizing gradient descent
- Building your first deep neural network: introduction to backpropagation
- How to picture neural networks: in your head and on paper
- Learning signal and ignoring noise: introduction to regularization and batching
- Modeling probabilities and nonlinearities: activation functions
- Neural learning about edges and corners: intro to convolutional neural networks
- Neural networks that understand language: king – man + woman == ?
- Neural networks that write like Shakespeare: recurrent layers for variable-length data
- Introducing automatic optimization: let’s build a deep learning framework
- Learning to write like Shakespeare: long short-term memory
- Deep learning on unseen data: introducing federated learning
- Where to go from here: a brief guide
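To give a flavor of the book's "from scratch with NumPy" approach, here is a minimal sketch of forward propagation and gradient descent for a tiny two-layer network. This is my own illustrative example, not code from the book; the toy data, layer sizes, activation choice, and learning rate are all made up for illustration.

```python
import numpy as np

# Toy data: 4 samples, 3 input features, one target per sample (illustrative only)
X = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0],
              [1.0, 1.0, 1.0]])
y = np.array([[1.0], [1.0], [0.0], [0.0]])

np.random.seed(0)
W1 = 2 * np.random.random((3, 4)) - 1   # input -> hidden weights
W2 = 2 * np.random.random((4, 1)) - 1   # hidden -> output weights

def relu(x):
    return np.maximum(0, x)

alpha = 0.1  # learning rate (arbitrary for this sketch)

for epoch in range(60):
    # Forward propagation
    hidden = relu(X @ W1)
    pred = hidden @ W2

    # Mean squared error over the batch
    error = np.mean((pred - y) ** 2)

    # Backpropagation: propagate the (scaled) error gradient back through both layers
    delta_out = (pred - y) / len(X)
    delta_hidden = (delta_out @ W2.T) * (hidden > 0)  # ReLU derivative

    # Gradient descent updates
    W2 -= alpha * hidden.T @ delta_out
    W1 -= alpha * X.T @ delta_hidden

print("final MSE:", error)
```

The book builds up each of these pieces (forward propagation, error measurement, backpropagation, regularization, batching) one chapter at a time in this same plain-NumPy style.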