How to remove the last FC layer from a ResNet model in PyTorch?

For a ResNet model, you can use the children attribute to access the layers, since a ResNet model in PyTorch consists of nn modules. (Tested on PyTorch 0.4.1.)

import torch
from torchvision import models

model = models.resnet152(pretrained=True)
# drop the last immediate child (the final fc layer) and rewrap the rest
newmodel = torch.nn.Sequential(*(list(model.children())[:-1]))
print(newmodel)
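
Note that the truncated model now ends with the average-pooling layer, so its output keeps singleton spatial dimensions and has to be flattened before it can feed a new classifier. A minimal sketch (the batch size and 224x224 input are just for illustration):

import torch
from torchvision import models

model = models.resnet152(pretrained=True)
newmodel = torch.nn.Sequential(*(list(model.children())[:-1]))

dummy = torch.randn(2, 3, 224, 224)  # hypothetical batch of two RGB images
with torch.no_grad():
    features = newmodel(dummy)
print(features.shape)                  # torch.Size([2, 2048, 1, 1])
features = torch.flatten(features, 1)  # -> torch.Size([2, 2048])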

Update: Although there is no universal answer to this question that works on all PyTorch models, it should work on all well-structured ones. The existing layers you add to your model (such as torch.nn.Linear, torch.nn.Conv2d, torch.nn.BatchNorm2d, ...) are all based on the torch.nn.Module class. And if you implement a custom layer and add it to your network, you should inherit it from PyTorch's torch.nn.Module class as well. As written in the documentation, the children attribute lets you access the modules of your class/model/network.

def children(self):
    r"""Returns an iterator over immediate children modules."""

Update: It is important to note that children() returns only the "immediate" modules, which means that if the last module of your network is a Sequential, it will be returned as a whole.
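
To illustrate the pitfall with a made-up model (the class and layer names below are hypothetical), slicing off the last child would remove the entire classifier block, not just its final Linear layer:

import torch.nn as nn

class ToyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(nn.Conv2d(3, 8, 3), nn.ReLU())
        self.classifier = nn.Sequential(   # the last immediate child is this whole block
            nn.Flatten(),
            nn.Linear(8 * 30 * 30, 64),
            nn.ReLU(),
            nn.Linear(64, 10),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

net = ToyNet()
print(list(net.children())[-1])  # the entire classifier Sequential, not just Linear(64, 10)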


You can do it simply by:

model.fc = nn.Sequential()

or alternatively you can create an Identity layer:

import torch.nn as nn

class Identity(nn.Module):
    """A pass-through layer that returns its input unchanged."""
    def __init__(self):
        super().__init__()

    def forward(self, x):
        return x

and replace the fc layer with it:

model.fc = Identity()
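
Either way, calling the model now returns the 2048-dimensional pooled features instead of class scores. In PyTorch 1.1 and later a built-in torch.nn.Identity is available, so the same effect can be had without the custom class; a quick sanity check (the input size here is just for illustration):

import torch
import torch.nn as nn
from torchvision import models

model = models.resnet152(pretrained=True)
model.fc = nn.Identity()  # built-in equivalent of the custom Identity above

with torch.no_grad():
    out = model(torch.randn(1, 3, 224, 224))
print(out.shape)  # torch.Size([1, 2048]) - pooled features rather than class scores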

If you are looking not just to strip the last FC layer from the model, but to replace it with your own, thereby taking advantage of transfer learning, you can do it this way:

import torch.nn as nn
from collections import OrderedDict

# model is the pretrained ResNet from above, e.g. models.resnet152(pretrained=True)
n_inputs = model.fc.in_features

# add more layers as required
classifier = nn.Sequential(OrderedDict([
    ('fc1', nn.Linear(n_inputs, 512))
]))

model.fc = classifier
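
A common companion step in transfer learning is to freeze the pretrained backbone so that only the new head is trained. A sketch under that assumption (freezing everything except the new fc is a choice, not a requirement):

import torch.optim as optim

# freeze all pretrained parameters
for param in model.parameters():
    param.requires_grad = False

# re-enable gradients for the newly attached classifier
for param in model.fc.parameters():
    param.requires_grad = True

# optimize only the trainable head
optimizer = optim.Adam(model.fc.parameters(), lr=1e-3)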