

GitHub Trending #1: A Comprehensive Collection of TensorFlow and PyTorch Deep Learning Resources

Published: 2023/12/19

[Xinzhiyuan Editor's Note] This project is a collection of deep learning architectures, models, and tricks for TensorFlow and PyTorch, all in Jupyter Notebooks. The content is extensive, targets Python 3.7, and works well as a reference handbook.

You can split up the content as needed, print it out, or turn it into an e-book for reference at any time.

Traditional Machine Learning

Perceptron

TensorFlow 1:

https://github.com/rasbt/deeplearning-models/blob/master/tensorflow1_ipynb/basic-ml/perceptron.ipynb

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/basic-ml/perceptron.ipynb

Logistic Regression

TensorFlow 1:

https://github.com/rasbt/deeplearning-models/blob/master/tensorflow1_ipynb/basic-ml/logistic-regression.ipynb

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/basic-ml/logistic-regression.ipynb

Softmax Regression (Multinomial Logistic Regression)

TensorFlow 1:

https://github.com/rasbt/deeplearning-models/blob/master/tensorflow1_ipynb/basic-ml/softmax-regression.ipynb

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/basic-ml/softmax-regression.ipynb
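Softmax regression amounts to a single linear layer trained with cross-entropy. Here is a minimal sketch of that idea (my own illustration, not the notebooks' code; the 784-feature / 10-class sizes assume flattened MNIST-style inputs):

```python
import torch
import torch.nn as nn

# Softmax regression: one linear layer producing raw logits.
# CrossEntropyLoss applies log-softmax internally, so no softmax layer is needed.
model = nn.Linear(784, 10)
criterion = nn.CrossEntropyLoss()

x = torch.randn(5, 784)                  # a batch of 5 flattened images
targets = torch.tensor([0, 3, 1, 9, 2])  # integer class labels
loss = criterion(model(x), targets)
loss.backward()                          # gradients for one training step
```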

Multilayer Perceptron

TensorFlow 1:

https://github.com/rasbt/deeplearning-models/blob/master/tensorflow1_ipynb/mlp/mlp-basic.ipynb

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/mlp/mlp-basic.ipynb

Multilayer Perceptron with Dropout

TensorFlow 1:

https://github.com/rasbt/deeplearning-models/blob/master/tensorflow1_ipynb/mlp/mlp-dropout.ipynb

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/mlp/mlp-dropout.ipynb

Multilayer Perceptron with Batch Normalization

TensorFlow 1:

https://github.com/rasbt/deeplearning-models/blob/master/tensorflow1_ipynb/mlp/mlp-batchnorm.ipynb

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/mlp/mlp-batchnorm.ipynb
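Since batch normalization in an MLP is the topic highlighted in this article's title, the pattern the notebooks above implement can be sketched as a BatchNorm1d layer between the linear transform and the activation. The layer sizes here are illustrative assumptions, not taken from the notebooks:

```python
import torch
import torch.nn as nn

# A minimal MLP with batch normalization applied before the activation.
class MLPBatchNorm(nn.Module):
    def __init__(self, num_features=784, num_hidden=128, num_classes=10):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(num_features, num_hidden),
            nn.BatchNorm1d(num_hidden),  # normalizes over the batch dimension
            nn.ReLU(),
            nn.Linear(num_hidden, num_classes),
        )

    def forward(self, x):
        return self.net(x)

model = MLPBatchNorm()
logits = model(torch.randn(32, 784))  # batch of 32 flattened 28x28 images
```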

Multilayer Perceptron with Backpropagation

TensorFlow 1:

https://github.com/rasbt/deeplearning-models/blob/master/tensorflow1_ipynb/mlp/mlp-lowlevel.ipynb

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/mlp/mlp-fromscratch__sigmoid-mse.ipynb

CNN

Basics

CNN

TensorFlow 1:

https://github.com/rasbt/deeplearning-models/blob/master/tensorflow1_ipynb/cnn/convnet.ipynb

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/cnn/cnn-basic.ipynb

CNN with He Initialization

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/cnn/cnn-he-init.ipynb
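He (Kaiming) initialization scales weights by fan-in so activation variance stays stable under ReLU. A hypothetical helper showing how it can be applied in PyTorch (the notebook's actual setup may differ):

```python
import torch
import torch.nn as nn

# Apply Kaiming-normal (He) initialization to conv and linear layers.
def he_init(module):
    if isinstance(module, (nn.Conv2d, nn.Linear)):
        nn.init.kaiming_normal_(module.weight, nonlinearity='relu')
        if module.bias is not None:
            nn.init.zeros_(module.bias)

conv = nn.Conv2d(3, 16, kernel_size=3)
he_init(conv)  # could also be applied model-wide via model.apply(he_init)
```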

Concepts

Replacing Fully-Connected Layers with Equivalent Convolutional Layers

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/cnn/fc-to-conv.ipynb
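The idea behind the notebook above can be sketched like this: a fully-connected layer on flattened C×H×W features computes the same function as a convolution whose kernel spans the entire feature map. The layer sizes below are illustrative assumptions, not the notebook's:

```python
import torch
import torch.nn as nn

fc = nn.Linear(8 * 4 * 4, 10)           # expects flattened 8x4x4 features
conv = nn.Conv2d(8, 10, kernel_size=4)  # kernel covers the full 4x4 map

# Copy the FC weights into the conv kernel so both compute the same function.
# Linear weight (10, 128) reshapes to conv weight (10, 8, 4, 4) because
# flatten() orders features channel-major, matching conv's (out, in, kH, kW).
with torch.no_grad():
    conv.weight.copy_(fc.weight.view(10, 8, 4, 4))
    conv.bias.copy_(fc.bias)

x = torch.randn(1, 8, 4, 4)
out_fc = fc(x.flatten(1))
out_conv = conv(x).flatten(1)  # (1, 10, 1, 1) -> (1, 10)
```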

Fully Convolutional

Fully Convolutional Neural Network

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/cnn/cnn-allconv.ipynb

AlexNet

AlexNet on CIFAR-10

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/cnn/cnn-alexnet-cifar10.ipynb

VGG

CNN VGG-16

TensorFlow 1:

https://github.com/rasbt/deeplearning-models/blob/master/tensorflow1_ipynb/cnn/cnn-vgg16.ipynb

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/cnn/cnn-vgg16.ipynb

VGG-16 Gender Classifier Trained on CelebA

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/cnn/cnn-vgg16-celeba.ipynb

CNN VGG-19

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/cnn/cnn-vgg19.ipynb

ResNet

ResNet and Residual Blocks

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/cnn/resnet-ex-1.ipynb

ResNet-18 Digit Classifier Trained on MNIST

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/cnn/cnn-resnet18-mnist.ipynb

ResNet-18 Gender Classifier Trained on CelebA

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/cnn/cnn-resnet18-celeba-dataparallel.ipynb

ResNet-34 Digit Classifier Trained on MNIST

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/cnn/cnn-resnet34-mnist.ipynb

ResNet-34 Gender Classifier Trained on CelebA

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/cnn/cnn-resnet34-celeba-dataparallel.ipynb

ResNet-50 Digit Classifier Trained on MNIST

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/cnn/cnn-resnet50-mnist.ipynb

ResNet-50 Gender Classifier Trained on CelebA

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/cnn/cnn-resnet50-celeba-dataparallel.ipynb

ResNet-101 Gender Classifier Trained on CelebA

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/cnn/cnn-resnet101-celeba.ipynb

ResNet-152 Gender Classifier Trained on CelebA

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/cnn/cnn-resnet152-celeba.ipynb

Network in Network

Network in Network CIFAR-10 Classifier

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/cnn/nin-cifar10.ipynb

Metric Learning

Siamese Network with Multilayer Perceptron

TensorFlow 1:

https://github.com/rasbt/deeplearning-models/blob/master/tensorflow1_ipynb/metric/siamese-1.ipynb

Autoencoders

Fully Connected Autoencoder

Autoencoder

TensorFlow 1:

https://github.com/rasbt/deeplearning-models/blob/master/tensorflow1_ipynb/autoencoder/autoencoder.ipynb

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/autoencoder/ae-basic.ipynb

Convolutional Autoencoder with Deconvolution / Transposed Convolution

TensorFlow 1:

https://github.com/rasbt/deeplearning-models/blob/master/tensorflow1_ipynb/autoencoder/ae-deconv.ipynb

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/autoencoder/ae-deconv.ipynb

Convolutional Autoencoder with Deconvolution (no pooling operations)

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/autoencoder/aer-deconv-nopool.ipynb

Convolutional Autoencoder with Nearest-Neighbor Interpolation

TensorFlow 1:

https://github.com/rasbt/deeplearning-models/blob/master/tensorflow1_ipynb/autoencoder/autoencoder-conv-nneighbor.ipynb

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/autoencoder/ae-conv-nneighbor.ipynb

Convolutional Autoencoder with Nearest-Neighbor Interpolation - Trained on CelebA

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/autoencoder/ae-conv-nneighbor-celeba.ipynb

Convolutional Autoencoder with Nearest-Neighbor Interpolation - Trained on Quickdraw

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/autoencoder/ae-conv-nneighbor-quickdraw-1.ipynb

Variational Autoencoders

Variational Autoencoder

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/autoencoder/ae-var.ipynb

Convolutional Variational Autoencoder

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/autoencoder/ae-conv-var.ipynb

Conditional Variational Autoencoders

Conditional Variational Autoencoder (with labels in the reconstruction loss)

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/autoencoder/ae-cvae.ipynb

Conditional Variational Autoencoder (without labels in the reconstruction loss)

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/autoencoder/ae-cvae_no-out-concat.ipynb

Convolutional Conditional Variational Autoencoder (with labels in the reconstruction loss)

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/autoencoder/ae-cnn-cvae.ipynb

Convolutional Conditional Variational Autoencoder (without labels in the reconstruction loss)

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/autoencoder/ae-cnn-cvae_no-out-concat.ipynb

GAN

Fully Connected GAN on MNIST

TensorFlow 1:

https://github.com/rasbt/deeplearning-models/blob/master/tensorflow1_ipynb/gan/gan.ipynb

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/gan/gan.ipynb

Convolutional GAN on MNIST

TensorFlow 1:

https://github.com/rasbt/deeplearning-models/blob/master/tensorflow1_ipynb/gan/gan-conv.ipynb

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/gan/gan-conv.ipynb

Convolutional GAN on MNIST with Label Smoothing

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/gan/gan-conv-smoothing.ipynb

RNN

Many-to-one: Sentiment Analysis / Classification

A simple single-layer RNN (IMDB)

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/rnn/rnn_simple_imdb.ipynb

A simple single-layer RNN with packed sequences to ignore padding characters (IMDB)

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/rnn/rnn_simple_packed_imdb.ipynb
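Packed sequences let the RNN skip padded positions so padding characters don't pollute the hidden state. A minimal sketch of the mechanism (the dimensions are made up for illustration, not taken from the notebook):

```python
import torch
import torch.nn as nn
from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence

rnn = nn.RNN(input_size=5, hidden_size=8, batch_first=True)

batch = torch.randn(3, 6, 5)       # 3 sequences, padded to length 6
lengths = torch.tensor([6, 4, 2])  # true (unpadded) lengths, sorted descending

# Pack, run the RNN (which ignores padded steps), then unpack.
packed = pack_padded_sequence(batch, lengths, batch_first=True)
packed_out, hidden = rnn(packed)
out, out_lengths = pad_packed_sequence(packed_out, batch_first=True)
```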

RNN with LSTM cells (IMDB)

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/rnn/rnn_lstm_packed_imdb.ipynb

RNN with LSTM cells and Own Dataset in CSV Format (IMDB)

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/rnn/rnn_lstm_packed_own_csv_imdb.ipynb

RNN with GRU cells (IMDB)

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/rnn/rnn_gru_packed_imdb.ipynb

Multilayer bi-directional RNN (IMDB)

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/rnn/rnn_gru_packed_imdb.ipynb

Many-to-Many / Sequence-to-Sequence

A simple character RNN to generate new text (Charles Dickens)

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/rnn/char_rnn-charlesdickens.ipynb

Ordinal Regression

Ordinal Regression CNN - CORAL w. ResNet-34 on AFAD-Lite

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/ordinal/ordinal-cnn-coral-afadlite.ipynb

Ordinal Regression CNN - Niu et al. 2016 w. ResNet-34 on AFAD-Lite

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/ordinal/ordinal-cnn-niu-afadlite.ipynb

Ordinal Regression CNN - Beckham and Pal 2016 w. ResNet-34 on AFAD-Lite

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/ordinal/ordinal-cnn-niu-afadlite.ipynb

Tips and Tricks

Cyclical Learning Rate

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/tricks/cyclical-learning-rate.ipynb
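A cyclical learning rate oscillates between a lower and upper bound instead of decaying monotonically. The linked notebook may implement the schedule by hand; PyTorch also ships a built-in CyclicLR scheduler, sketched here with illustrative bounds and an assumed toy model:

```python
import torch

model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
scheduler = torch.optim.lr_scheduler.CyclicLR(
    optimizer, base_lr=0.001, max_lr=0.01, step_size_up=50)

lrs = []
for step in range(100):
    optimizer.step()   # in real training, loss.backward() would come first
    scheduler.step()   # lr cycles between base_lr and max_lr
    lrs.append(optimizer.param_groups[0]['lr'])
```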

PyTorch Workflows and Mechanics

Custom Datasets

Using PyTorch's Dataset Loading Utilities for Custom Datasets - CSV Files Converted to HDF5

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/mechanics/custom-data-loader-csv.ipynb
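The custom-dataset pattern these notebooks demonstrate boils down to implementing `__len__` and `__getitem__` on a `Dataset` subclass and letting `DataLoader` handle batching and shuffling. A minimal sketch (in-memory tensors stand in for the CSV/HDF5 data used in the notebooks):

```python
import torch
from torch.utils.data import Dataset, DataLoader

class MyDataset(Dataset):
    def __init__(self, features, labels):
        self.features = features
        self.labels = labels

    def __len__(self):
        return len(self.labels)          # number of examples

    def __getitem__(self, index):
        return self.features[index], self.labels[index]

dataset = MyDataset(torch.randn(100, 4), torch.randint(0, 2, (100,)))
loader = DataLoader(dataset, batch_size=25, shuffle=True)
batches = [batch for batch in loader]    # 4 batches of 25 examples each
```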

Using PyTorch's Dataset Loading Utilities for Custom Datasets - Images from CelebA

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/mechanics/custom-data-loader-celeba.ipynb

Using PyTorch's Dataset Loading Utilities for Custom Datasets - Drawings from Quickdraw

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/mechanics/custom-data-loader-quickdraw.ipynb

Using PyTorch's Dataset Loading Utilities for Custom Datasets - Drawings from the Street View House Numbers (SVHN) Dataset

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/custom-data-loader-svhn.ipynb

Training and Preprocessing

Data Loading with Pinned Memory

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/cnn/cnn-resnet34-cifar10-pinmem.ipynb
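Pinned (page-locked) host memory lets host-to-GPU copies run asynchronously so they can overlap with compute. A sketch of the flag in use (dataset shape and batch size are illustrative; on a CPU-only machine the flag is effectively a no-op):

```python
import torch
from torch.utils.data import TensorDataset, DataLoader

dataset = TensorDataset(torch.randn(64, 3, 32, 32),
                        torch.randint(0, 10, (64,)))
# pin_memory=True keeps batches in page-locked host memory
loader = DataLoader(dataset, batch_size=16, shuffle=True, pin_memory=True)

device = 'cuda' if torch.cuda.is_available() else 'cpu'
for images, targets in loader:
    images = images.to(device, non_blocking=True)   # async copy when pinned
    targets = targets.to(device, non_blocking=True)
```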

Standardizing Images

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/cnn/cnn-standardized.ipynb
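Standardizing images means subtracting the per-channel mean and dividing by the per-channel standard deviation computed over the training set. A minimal sketch with stand-in data (torchvision's `transforms.Normalize` applies the same operation per sample):

```python
import torch

images = torch.rand(100, 3, 32, 32)  # stand-in for a training set

# Per-channel statistics over batch, height, and width dimensions.
mean = images.mean(dim=(0, 2, 3), keepdim=True)
std = images.std(dim=(0, 2, 3), keepdim=True)

standardized = (images - mean) / std  # each channel now ~zero mean, unit std
```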

Image Transformation Examples

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/mechanics/torchvision-transform-examples.ipynb

Char-RNN with Own Text File

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/rnn/char_rnn-charlesdickens.ipynb

Sentiment Classification RNN with Own CSV File

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/rnn/rnn_lstm_packed_own_csv_imdb.ipynb

Parallel Computing

Using Multiple GPUs with DataParallel - VGG-16 Gender Classifier on CelebA

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/cnn/cnn-vgg16-celeba-data-parallel.ipynb

Miscellaneous

Sequential API and hooks

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/mlp/mlp-sequential.ipynb

Weight Sharing Within a Layer

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/mechanics/cnn-weight-sharing.ipynb
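Weight sharing in PyTorch can be as simple as applying the same module object more than once in `forward`: both applications then use, and accumulate gradients into, a single weight tensor. A minimal sketch with made-up layer sizes:

```python
import torch
import torch.nn as nn

class SharedMLP(nn.Module):
    def __init__(self):
        super().__init__()
        self.shared = nn.Linear(8, 8)  # one parameter set, used twice
        self.out = nn.Linear(8, 2)

    def forward(self, x):
        x = torch.relu(self.shared(x))
        x = torch.relu(self.shared(x))  # second use of the same weights
        return self.out(x)

model = SharedMLP()
# Only one copy of the shared layer's parameters exists:
# (8*8 + 8) + (8*2 + 2) = 90, not 72 + 72 + 18.
n_params = sum(p.numel() for p in model.parameters())
```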

Plotting Live Training Performance in Jupyter Notebooks with Just Matplotlib

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/mlp/plot-jupyter-matplotlib.ipynb

Autograd

Getting the Gradients of Intermediate Variables in PyTorch

PyTorch:

https://github.com/rasbt/deeplearning-models/blob/master/pytorch_ipynb/mechanics/manual-gradients.ipynb
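By default PyTorch only keeps gradients for leaf tensors and frees those of intermediate results. Calling `retain_grad()` on an intermediate tensor before `backward()` preserves its gradient, as this minimal sketch shows:

```python
import torch

x = torch.tensor([2.0], requires_grad=True)  # leaf variable
y = x ** 2       # intermediate (non-leaf) variable
y.retain_grad()  # must be called before backward(), or y.grad stays None
z = 3 * y
z.backward()

# dz/dy = 3, and dz/dx = 3 * 2x = 12 at x = 2
```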

TensorFlow Workflows and Mechanics

Custom Datasets

Chunking an Image Dataset for Minibatch Training Using NumPy NPZ Archives

TensorFlow 1:

https://github.com/rasbt/deeplearning-models/blob/master/tensorflow1_ipynb/mechanics/image-data-chunking-npz.ipynb

Storing an Image Dataset for Minibatch Training Using HDF5

TensorFlow 1:

https://github.com/rasbt/deeplearning-models/blob/master/tensorflow1_ipynb/mechanics/image-data-chunking-hdf5.ipynb

Using Input Pipelines to Read Data from TFRecords Files

TensorFlow 1:

https://github.com/rasbt/deeplearning-models/blob/master/tensorflow1_ipynb/mechanics/tfrecords.ipynb

Using Queue Runners to Feed Images Directly from Disk

TensorFlow 1:

https://github.com/rasbt/deeplearning-models/blob/master/tensorflow1_ipynb/mechanics/file-queues.ipynb

Using TensorFlow's Dataset API

TensorFlow 1:

https://github.com/rasbt/deeplearning-models/blob/master/tensorflow1_ipynb/mechanics/dataset-api.ipynb

Training and Preprocessing

Saving and Loading Trained Models - from TensorFlow Checkpoint Files and NumPy NPZ Archives

TensorFlow 1:

https://github.com/rasbt/deeplearning-models/blob/master/tensorflow1_ipynb/mechanics/saving-and-reloading-models.ipynb

Reference:

https://github.com/rasbt/deeplearning-models
