How to get per-layer execution times for an AI model saved as a .pth file?

I am trying to run a ResNet-like image classification model on the CPU and want to know how much time each layer of the model takes.

The problem I face is that the models at https://github.com/facebookresearch/semi-supervised-ImageNet1K-models are distributed as .pth files. The file is very large (hundreds of MB), and beyond knowing it is binary, I am not clear how it relates to PyTorch internally. I load the model with the script below, but I don't know how to modify the model or insert `t = time.time()` statements between its layers to break down per-layer execution time.
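(For what it's worth, a .pth file is nothing PyTorch-specific beyond what `torch.save` produces: a pickle/zip container of tensors, usually a `state_dict` mapping parameter names to weights. A minimal sketch with a tiny stand-in model, not the actual hub checkpoint:)

```python
import os
import tempfile

import torch
import torch.nn as nn

# Tiny hypothetical model just to illustrate the file format
m = nn.Linear(4, 2)

path = os.path.join(tempfile.gettempdir(), "model.pth")

# torch.save serializes the state_dict (parameter name -> tensor);
# the .pth extension is only a convention
torch.save(m.state_dict(), path)

sd = torch.load(path)
print(list(sd.keys()))  # parameter names, e.g. ['weight', 'bias']
```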

Questions:

  1. Does running the script below correctly estimate the end-to-end time (t2 - t1) of running the model on the CPU, or will it also include PyTorch compilation/setup time?

  2. How do I insert timing statements between consecutive layers to get a per-layer breakdown?

  3. The GitHub repo contains no inference/training scripts, only .pth files. How should I run inference or training? And how can I insert additional layers between consecutive layers of the .pth model and save the result?

```python
#!/usr/bin/env python
import torch
import torchvision
import time

model = torch.hub.load('facebookresearch/semi-supervised-ImageNet1K-models',
                       'resnext50_32x4d_swsl', force_reload=False)
x = torch.randn(1, 3, 224, 224)  # renamed from 'in', which is a Python keyword
t1 = time.time()
out = model(x)
t2 = time.time()
```

Answer:
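First, on question 1: PyTorch runs eagerly by default, so there is no compilation step in `t2 - t1`. The first forward pass does, however, pay one-time costs (memory allocation, cache warming), so a single measurement is noisy; the usual pattern is a few warm-up passes followed by an average over repeated runs. A minimal sketch, using a tiny stand-in model so it runs without downloading the hub checkpoint:

```python
import time
import torch
import torch.nn as nn

# Tiny stand-in model; in the real script this would be resnext50_32x4d_swsl
model = nn.Sequential(
    nn.Conv2d(3, 8, 3),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(8, 10),
)
model.eval()

x = torch.randn(1, 3, 224, 224)

with torch.no_grad():
    # Warm-up: the first pass pays one-time costs
    for _ in range(3):
        model(x)

    # Average over several timed passes
    n = 10
    t1 = time.time()
    for _ in range(n):
        model(x)
    t2 = time.time()

print(f"average forward time: {(t2 - t1) / n * 1000:.3f} ms")
```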

A simple way to do this is to register a forward hook on every module of the model. Each hook updates a global timestamp and records the time elapsed since the previous hook fired.

For example:

```python
import torch
import torchvision
import time

global_time = None
exec_times = []

def store_time(self, input, output):
    global global_time, exec_times
    exec_times.append(time.time() - global_time)
    global_time = time.time()

model = torch.hub.load('facebookresearch/semi-supervised-ImageNet1K-models',
                       'resnext50_32x4d_swsl', force_reload=False)
x = torch.randn(1, 3, 224, 224)

# Register a hook on every module to record time deltas
for module in model.modules():
    module.register_forward_hook(store_time)

global_time = time.time()
out = model(x)
t2 = time.time()

for module, t in zip(model.modules(), exec_times):
    print(f"{module.__class__}: {t}")
```

The output I get is (truncated):

```
<class 'torchvision.models.resnet.ResNet'>: 0.004999876022338867
<class 'torch.nn.modules.conv.Conv2d'>: 0.002006053924560547
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.0009946823120117188
<class 'torch.nn.modules.activation.ReLU'>: 0.007998466491699219
<class 'torch.nn.modules.pooling.MaxPool2d'>: 0.0010004043579101562
<class 'torch.nn.modules.container.Sequential'>: 0.0020003318786621094
<class 'torchvision.models.resnet.Bottleneck'>: 0.0010023117065429688
<class 'torch.nn.modules.conv.Conv2d'>: 0.017997026443481445
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.0010018348693847656
...
<class 'torch.nn.modules.pooling.AdaptiveAvgPool2d'>: 0.002003192901611328
<class 'torch.nn.modules.linear.Linear'>: 0.0019965171813964844
```
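One caveat with the listing above: `zip(model.modules(), exec_times)` pairs modules in definition order with times in execution order, and these differ (a parent module's hook fires only after all of its children), so some labels can be shifted. Recording the name inside the hook avoids the mismatch. A minimal sketch with a tiny stand-in model (swap in the hub model for real measurements):

```python
import time
import torch
import torch.nn as nn

# Tiny stand-in model; the real code would load the hub model instead
model = nn.Sequential(
    nn.Conv2d(3, 8, 3),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(8, 10),
)
model.eval()

records = []  # (module name, timestamp when its forward finished)

def make_hook(name):
    def hook(module, inputs, output):
        records.append((name, time.time()))
    return hook

for name, module in model.named_modules():
    module.register_forward_hook(make_hook(name))

x = torch.randn(1, 3, 224, 224)
start = time.time()
with torch.no_grad():
    model(x)

# Delta between consecutive hook firings approximates each module's own time
prev = start
for name, t in records:
    print(f"{name or '<root>'}: {(t - prev) * 1000:.3f} ms")
    prev = t
```

For a more detailed breakdown without manual hooks, `torch.autograd.profiler.profile()` (or the newer `torch.profiler`) reports per-operator CPU times directly.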
