
AlexNet Code Walkthrough

Contents

  • AlexNet Code Walkthrough
  • Overview
  • Network Architecture Diagram
  • AlexNet Code Details

Overview

AlexNet's network structure is quite simple: it is an early-generation CNN that relies on little beyond the basic building blocks.
The network splits into two parts: a feature extractor built from convolution, activation, and pooling layers, and a classifier built as a feed-forward (fully connected) network.

Network Architecture Diagram

AlexNet Code Details

import torch
import torch.nn as nn
from typing import Any
from torch.hub import load_state_dict_from_url
from torchsummary import summary

__all__ = ['AlexNet', 'alexnet']

model_urls = {
    'alexnet': 'https://download.pytorch.org/models/alexnet-owt-4df8aa71.pth',
}


class AlexNet(nn.Module):
    def __init__(self, num_classes: int = 1000) -> None:
        super(AlexNet, self).__init__()
        # Feature extractor: a chain of conv, ReLU and max-pool layers.
        # AlexNet's convolutional part is shallow, so it is written directly
        # inside self.features:
        # (conv, ReLU, pool) x 2, (conv, ReLU) x 2, conv, ReLU, pool
        self.features = nn.Sequential(
            nn.Conv2d(3, 64, kernel_size=11, stride=4, padding=2),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(kernel_size=3, stride=2),
            nn.Conv2d(64, 192, kernel_size=5, padding=2),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(kernel_size=3, stride=2),
            nn.Conv2d(192, 384, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(384, 256, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(256, 256, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(kernel_size=3, stride=2),
        )
        self.avgpool = nn.AdaptiveAvgPool2d(6)
        # Classifier on top of the extracted features:
        # (dropout, linear, ReLU) x 2, linear
        self.classifier = nn.Sequential(
            nn.Dropout(),
            nn.Linear(256 * 6 * 6, 4096),
            nn.ReLU(inplace=True),
            nn.Dropout(),
            nn.Linear(4096, 4096),
            nn.ReLU(inplace=True),
            nn.Linear(4096, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        x = self.avgpool(x)
        x = torch.flatten(x, 1)
        x = self.classifier(x)
        return x


# Build an AlexNet instance, optionally loading pretrained weights
def alexnet(pretrained: bool = False, progress: bool = True, **kwargs: Any) -> AlexNet:
    model = AlexNet(**kwargs)
    if pretrained:
        state_dict = load_state_dict_from_url(model_urls['alexnet'], progress=progress)
        model.load_state_dict(state_dict)
    return model
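As a quick sanity check before looking at the per-layer shapes, a minimal sketch along the following lines (my own addition, assuming the class and factory function above are already defined) runs one random 3x224x224 image through the network and confirms that the output is a 1000-way logit vector:

# Hypothetical sanity check: feed a single random image through the model.
model = alexnet(pretrained=False)          # random weights, no download needed
dummy = torch.randn(1, 3, 224, 224)        # batch of one RGB 224x224 image
with torch.no_grad():
    logits = model(dummy)
print(logits.shape)                        # expected: torch.Size([1, 1000])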

Now feed a 3x224x224 tensor into the network and see what size it becomes after each layer of AlexNet.

from torchsummary import summary

device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
t = AlexNet().to(device)
summary(t, (3, 224, 224))

The output is as follows.

----------------------------------------------------------------
        Layer (type)               Output Shape         Param #
================================================================
            Conv2d-1           [-1, 64, 55, 55]          23,296
              ReLU-2           [-1, 64, 55, 55]               0
         MaxPool2d-3           [-1, 64, 27, 27]               0
            Conv2d-4          [-1, 192, 27, 27]         307,392
              ReLU-5          [-1, 192, 27, 27]               0
         MaxPool2d-6          [-1, 192, 13, 13]               0
            Conv2d-7          [-1, 384, 13, 13]         663,936
              ReLU-8          [-1, 384, 13, 13]               0
            Conv2d-9          [-1, 256, 13, 13]         884,992
             ReLU-10          [-1, 256, 13, 13]               0
           Conv2d-11          [-1, 256, 13, 13]         590,080
             ReLU-12          [-1, 256, 13, 13]               0
        MaxPool2d-13            [-1, 256, 6, 6]               0
AdaptiveAvgPool2d-14            [-1, 256, 6, 6]               0
          Dropout-15                 [-1, 9216]               0
           Linear-16                 [-1, 4096]      37,752,832
             ReLU-17                 [-1, 4096]               0
          Dropout-18                 [-1, 4096]               0
           Linear-19                 [-1, 4096]      16,781,312
             ReLU-20                 [-1, 4096]               0
           Linear-21                 [-1, 1000]       4,097,000
================================================================
Total params: 61,100,840
Trainable params: 61,100,840
Non-trainable params: 0
----------------------------------------------------------------
Input size (MB): 0.57
Forward/backward pass size (MB): 8.38
Params size (MB): 233.08
Estimated Total Size (MB): 242.03
----------------------------------------------------------------
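The spatial sizes in this table can also be checked by hand with the usual convolution/pooling formula, output = floor((input - kernel + 2*padding) / stride) + 1. The short sketch below (my own addition, using the hyperparameters from the features block above) reproduces the 224 → 55 → 27 → 13 → 6 progression; the parameter counts follow similarly, e.g. Linear-16 has 9216 * 4096 + 4096 = 37,752,832 weights and biases.

# Minimal sketch of the conv/pool size arithmetic used by the feature extractor.
def out_size(n, kernel, stride, padding=0):
    return (n - kernel + 2 * padding) // stride + 1

n = 224
n = out_size(n, 11, 4, 2)   # Conv2d-1     -> 55
n = out_size(n, 3, 2)       # MaxPool2d-3  -> 27
n = out_size(n, 5, 1, 2)    # Conv2d-4     -> 27
n = out_size(n, 3, 2)       # MaxPool2d-6  -> 13
n = out_size(n, 3, 1, 1)    # Conv2d-7/9/11 keep 13
n = out_size(n, 3, 2)       # MaxPool2d-13 -> 6
print(n)                    # 6, so the classifier sees 256 * 6 * 6 = 9216 features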
