![Nora Belrose on X: "the @huggingface implementation of swin transformer v2 outputs NaN at initialization when you change the image size or number of channels from the default https://t.co/AXMahI2ptl"](https://pbs.twimg.com/media/GEk1gYRWsAACFys.png)
Nora Belrose on X: "the @huggingface implementation of swin transformer v2 outputs NaN at initialization when you change the image size or number of channels from the default https://t.co/AXMahI2ptl"
![TorchServe: Increasing inference speed while improving efficiency - deployment - PyTorch Dev Discussions](https://global.discourse-cdn.com/standard10/uploads/pytorch1/original/2X/0/0c2ce27b800a356c166df89b66fc26702ad45faf.png)
TorchServe: Increasing inference speed while improving efficiency - deployment - PyTorch Dev Discussions
![TorchServe: Increasing inference speed while improving efficiency - deployment - PyTorch Dev Discussions](https://global.discourse-cdn.com/standard10/uploads/pytorch1/original/2X/2/209c033d4dfe32debf73a6d462c5537c87976137.png)
TorchServe: Increasing inference speed while improving efficiency - deployment - PyTorch Dev Discussions
![01_PyTorch Workflow - 42. timestamp 5:35:00 - Problem with plot_prediction · mrdbourke pytorch-deep-learning · Discussion #341 · GitHub](https://user-images.githubusercontent.com/37710072/224113755-ec7418c6-fa8e-4b9c-8ebe-48e3705e7e50.png)
01_PyTorch Workflow - 42. timestamp 5:35:00 - Problem with plot_prediction · mrdbourke pytorch-deep-learning · Discussion #341 · GitHub
![Performance of `torch.compile` is significantly slowed down under `torch.inference_mode` - torch.compile - PyTorch Forums](https://discuss.pytorch.org/uploads/default/original/3X/d/6/d65819241a215e5606721d6179a38d960e0ef159.png)
Performance of `torch.compile` is significantly slowed down under `torch.inference_mode` - torch.compile - PyTorch Forums
torch.inference_mode and tensor subclass: RuntimeError: Cannot set version_counter for inference tensor · Issue #112024 · pytorch/pytorch · GitHub
Inference mode complains about inplace at torch.mean call, but I don't use inplace · Issue #70177 · pytorch/pytorch · GitHub
![PyTorch on X: "4. ⚠️ Inference tensors can't be used outside InferenceMode for Autograd operations. ⚠️ Inference tensors can't be modified in-place outside InferenceMode. ✓ Simply clone the inference tensor and you're good to go."](https://pbs.twimg.com/media/E_Q4bkJXMAcTBXF.jpg)
PyTorch on X: "4. ⚠️ Inference tensors can't be used outside InferenceMode for Autograd operations. ⚠️ Inference tensors can't be modified in-place outside InferenceMode. ✓ Simply clone the inference tensor and you're good to go."