d2l-pytorch
This project is adapted from the original Dive Into Deep Learning book by Aston Zhang, Zachary C. Lipton, Mu Li, Alex J. Smola, and all the community contributors. We have adapted the book's content and converted the MXNet code snippets into PyTorch.
Note: Some .ipynb notebooks may not render correctly on GitHub. We suggest cloning the repo or using nbviewer to view the notebooks.
Ch02 Installation
Ch03 Introduction
Ch04 The Preliminaries: A Crashcourse
4.2 Linear Algebra
4.6 Documentation
Ch05 Linear Neural Networks
Ch06 Multilayer Perceptrons
Ch07 Deep Learning Computation
7.4 Custom Layers
7.5 File I/O
7.6 GPUs
Ch08 Convolutional Neural Networks
8.5 Pooling
Ch09 Modern Convolutional Networks
Ch10 Recurrent Neural Networks
10.1 Sequence Models
10.2 Language Models
10.4 Text Preprocessing
10.5 Implementation of Recurrent Neural Networks from Scratch
10.11 Bidirectional Recurrent Neural Networks
10.14 Sequence to Sequence
10.15 Beam Search
Ch11 Attention Mechanism
11.1 Attention Mechanism
11.2 Sequence to Sequence with Attention Mechanism
11.3 Transformer
Ch12 Optimization Algorithms
12.2 Convexity
12.3 Gradient Descent
12.6 Momentum
12.7 Adagrad
12.8 RMSProp
12.9 Adadelta
12.10 Adam
Ch14 Computer Vision
14.1 Image Augmentation
14.2 Fine Tuning
14.4 Anchor Boxes
14.8 Region-based CNNs (R-CNNs)
14.9 Semantic Segmentation and Data Sets
14.10 Transposed Convolution
14.11 Fully Convolutional Networks (FCN)
14.12 Neural Style Transfer
14.13 Image Classification (CIFAR-10) on Kaggle
14.14 Dog Breed Identification (ImageNet Dogs) on Kaggle
Please feel free to open a Pull Request to contribute a notebook in PyTorch for the remaining chapters. Before starting on a notebook, open an issue with the name of that notebook so we can assign it to you (if no one has been assigned already).
Strictly follow the naming conventions for the IPython Notebooks and the subsections.
Also, if you think any section requires more or better explanation, please open an issue on the issue tracker and let us know. We'll get back to you as soon as possible.
Find some code that needs improvement and submit a pull request.
Find a reference that we missed and submit a pull request.
Try not to submit huge pull requests, since they are hard to review and incorporate. It's better to send several smaller ones.
If you like this repo and find it useful, please consider starring it (★) so that it can reach a broader audience.
[1] Original book Dive Into Deep Learning -> GitHub Repo
[2] Deep Learning - The Straight Dope
[3] PyTorch - MXNet Cheatsheet
If you use this work or code for your research, please cite the original book with the following BibTeX entry.

@book{zhang2019dive,
  title={Dive into Deep Learning},
  author={Aston Zhang and Zachary C. Lipton and Mu Li and Alexander J. Smola},
  note={\url{http://www.d2l.ai}},
  year={2019}
}