RuntimeError: Expected all tensors to be on the same device

I am getting the following error:

RuntimeError: Expected all tensors to be on the same device

However, all of my tensors are moved with .to(device=t.device):

            self.indices_buf = torch.LongTensor().to(device=t.device)
            self.beams_buf = torch.LongTensor().to(device=t.device)
            self.beams_buf_float = torch.FloatTensor().to(device=t.device)

Here, the call self.beams_buf_float.type(torch.LongTensor) raises the "Expected all tensors to be on the same device" error:

        torch.div(self.indices_buf, vocab_size, out=self.beams_buf_float)
        self.beams_buf = self.beams_buf_float.type(torch.LongTensor)

I am confused, since all of them were created with device=t.device.


When you call self.beams_buf_float.type(torch.LongTensor), the resulting tensor is placed on the default device (i.e. cpu), because torch.LongTensor is specifically a CPU tensor type. So even though self.beams_buf_float lives on the GPU, the cast silently moves the result back to the CPU, and any subsequent operation mixing it with a GPU tensor triggers the error.

The correct way to cast your tensor to a new dtype while keeping its original device is to call self.beams_buf_float.to(torch.long) or self.beams_buf_float.long().
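A minimal sketch of the difference (the tensor name x here is illustrative; the behavior is the same whether the tensor lives on cpu or cuda):

```python
import torch

# Pick whatever device is available; on a GPU machine this is "cuda".
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

x = torch.tensor([1.9, 2.5], device=device)

# .type(torch.LongTensor) casts to a *CPU* tensor type, so the result
# always ends up on the CPU, regardless of where x lives.
cpu_long = x.type(torch.LongTensor)

# Device-preserving casts: these change only the dtype.
a = x.to(torch.long)
b = x.long()

print(cpu_long.device)  # always cpu
print(a.device == x.device, b.device == x.device)  # True True
print(a.dtype)  # torch.int64
```

Both .to(torch.long) and .long() return a torch.int64 tensor on the same device as the input, which is why they avoid the mixed-device error.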