
Commit e089ba4

Enable stricter type checking for torch/fx (#19156)
Summary:
Add a torch/fx/** sub-config to pyrefly.toml enabling the implicit-any, unannotated-return, and unannotated-parameter checks. Add tabulate to ignore-missing-imports since it is not a strict requirement.

Authored with Claude.

cc voznesenskym penguinwu EikanWang jgong5 Guobing-Chen XiaobingSuper zhuhaozhe blzheng wenzhe-nrv jiayisunx ipiszy kadeng muchulee8 amjames chauhang aakhundov coconutruben jataylo

X-link: pytorch/pytorch#180625
Differential Revision: D102619388
Pulled By: Lucaskabela
1 parent f3e49ff commit e089ba4

1 file changed: backends/cadence/aot/fuse_ops.py

Lines changed: 2 additions & 0 deletions
@@ -55,6 +55,7 @@
 def get_tensor_arg(node: torch.fx.Node, arg_name: str) -> torch.Tensor:
     graph_module = node.graph.owning_module
+    assert graph_module is not None
     tensor = get_tensor_from_attr(graph_module, get_arg(node, arg_name, torch.fx.Node))
     assert isinstance(tensor, torch.Tensor), f"{arg_name} must be present"
     return tensor
@@ -264,6 +265,7 @@ def _extract_conv_params(
     conv_weight = get_tensor_arg(conv_node, "weight")
     # conv_bias is truly optional - fusion function handles None
     graph_module = conv_node.graph.owning_module
+    assert graph_module is not None
     conv_bias = get_tensor_from_attr(
         graph_module, cast(Optional[torch.fx.Node], get_arg(conv_node, "bias"))
     )
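Both hunks apply the same pattern: `Graph.owning_module` is an Optional attribute, so under stricter type checking its direct use is flagged as possibly `None`; inserting `assert graph_module is not None` narrows the type for the checker without changing runtime behavior on valid graphs. A minimal sketch of that narrowing, using hypothetical stand-in classes rather than the real torch.fx types:

```python
from typing import Optional


class Graph:
    """Stand-in for torch.fx.Graph: owning_module may be None."""

    def __init__(self, owning_module: Optional[str] = None) -> None:
        self.owning_module = owning_module


def get_owner(graph: Graph) -> str:
    module = graph.owning_module  # inferred as Optional[str]
    # Without this assert, a strict checker rejects returning `module`
    # (Optional[str] is not str). The assert narrows it to str below,
    # and fails fast at runtime if the graph is detached from a module.
    assert module is not None
    return module


print(get_owner(Graph("fuse_ops_module")))
```

The same idiom (assert-to-narrow) is what the commit adds in both `get_tensor_arg` and `_extract_conv_params` before `graph_module` is passed to helpers that expect a non-None module.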
