signxai.torch_signxai package

Submodules

signxai.torch_signxai.methods module

Refactored PyTorch explanation methods with a unified execution entry point. This module applies DRY principles to eliminate redundant wrapper functions.

signxai.torch_signxai.methods.register_method(name)[source]

Decorator that registers a method implementation under the given name.

signxai.torch_signxai.methods.execute(model: Module, x: Tensor, parsed_method: Dict[str, Any], **kwargs) → ndarray[source]

Executes the specified XAI method after parsing and normalization.

Parameters:
  • model – The PyTorch model.

  • x – The input tensor.

  • parsed_method – A dictionary from MethodParser.

  • **kwargs – Additional runtime keyword arguments.

Returns:

The explanation map as a numpy array.
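The register_method/execute pair follows a standard registry-dispatch pattern: implementations register themselves under a name, and execute looks the name up and dispatches. A minimal sketch of that pattern in plain Python (the registry, the "params" key, and the placeholder implementation are illustrative assumptions, not the library's actual code):

```python
# Illustrative registry-dispatch pattern, mirroring register_method/execute.
# The real implementations in signxai.torch_signxai.methods differ in detail.
_METHODS = {}

def register_method(name):
    """Decorator that stores an implementation under `name`."""
    def decorator(fn):
        _METHODS[name] = fn
        return fn
    return decorator

@register_method("gradient")
def _gradient(model, x, **kwargs):
    # Placeholder body for the sketch; a real method would compute a map.
    return {"method": "gradient", "input": x}

def execute(model, x, parsed_method, **kwargs):
    """Look up the parsed method name and dispatch to its implementation."""
    name = parsed_method["name"]
    if name not in _METHODS:
        raise ValueError(f"Unknown XAI method: {name}")
    # Merge parameters parsed from the method string with runtime kwargs.
    params = {**parsed_method.get("params", {}), **kwargs}
    return _METHODS[name](model, x, **params)

result = execute(None, [1.0, 2.0], {"name": "gradient"})
```

The DRY benefit mentioned above comes from this single dispatch point: each method variant registers once instead of getting its own wrapper function.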

signxai.torch_signxai.methods_family module

PyTorch integration for Method Family Architecture. This module provides an alternative entry point that uses the new family-based approach.

signxai.torch_signxai.methods_family.calculate_relevancemap_with_families(model, input_tensor, method: str, target_class=None, **kwargs)[source]

Calculates a relevance map using the Method Family Architecture, falling back to the original method wrappers if a family-based implementation is not available.

Parameters:
  • model – PyTorch model (without softmax)

  • input_tensor – Input tensor

  • method – The XAI method to use

  • target_class – Target class index (optional)

  • **kwargs – Additional method-specific parameters

Returns:

The relevance map as a numpy array.

signxai.torch_signxai.utils module

Module contents

signxai.torch_signxai.calculate_relevancemap(model, input_tensor, method: str, target_class=None, **kwargs)

Calculates a relevance map using the Method Family Architecture, falling back to the original method wrappers if a family-based implementation is not available.

Parameters:
  • model – PyTorch model (without softmax)

  • input_tensor – Input tensor

  • method – The XAI method to use

  • target_class – Target class index (optional)

  • **kwargs – Additional method-specific parameters

Returns:

The relevance map as a numpy array.
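To illustrate what a relevance map is, the simplest method family (plain gradients) can be sketched in pure PyTorch. This is an illustrative sketch, not the library's internal implementation; the toy model is an assumption:

```python
import torch
import torch.nn as nn

def gradient_relevancemap(model, input_tensor, target_class=None):
    """Sketch of the simplest relevance map: the gradient of the
    target-class logit with respect to the input."""
    x = input_tensor.clone().detach().requires_grad_(True)
    logits = model(x)                      # model must output raw logits
    if target_class is None:
        target_class = int(logits.argmax(dim=1))
    logits[0, target_class].backward()     # d(logit)/d(input)
    return x.grad.detach().numpy()

# Tiny demonstration model (illustrative, not from signxai).
model = nn.Sequential(nn.Linear(4, 3))
relevance = gradient_relevancemap(model, torch.ones(1, 4))
```

The returned array has the same shape as the input, which is what calculate_relevancemap returns for any method: one relevance score per input element.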

signxai.torch_signxai.remove_softmax(model: Module) → Module[source]

Removes the softmax layer from a PyTorch model if it is the last layer in model.classifier or a common sequential structure. The model is modified in place when a Softmax layer is found. For models that output raw logits and have no explicit Softmax layer (such as the custom VGG16_PyTorch), the function is effectively a no-op, which is the correct behavior.
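The in-place removal described above can be sketched in a few lines of PyTorch. This is a simplified sketch covering only the Sequential case, not the library's full logic:

```python
import torch.nn as nn

def remove_trailing_softmax(model):
    """Sketch of the in-place removal: drop a trailing nn.Softmax from
    model.classifier (or from the model itself if it is Sequential)."""
    seq = getattr(model, "classifier", model)
    if isinstance(seq, nn.Sequential) and len(seq) > 0 and isinstance(seq[-1], nn.Softmax):
        del seq[-1]            # modify in place; no-op for logit-only models
    return model

# A model whose last layer is Softmax (illustrative).
m = nn.Sequential(nn.Linear(4, 3), nn.Softmax(dim=1))
m = remove_trailing_softmax(m)
```

Stripping the softmax matters because gradient-based XAI methods should backpropagate from logits; softmax saturates and flattens the gradients.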

signxai.torch_signxai.decode_predictions(preds: Tensor, top: int = 5, class_list_path: str | None = None) → List[List[Tuple[str, str, float]]][source]

Decodes the predictions of an ImageNet model.
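Judging from the return type, each sample is decoded to a list of (class_id, class_name, score) tuples. A sketch of that decoding in pure PyTorch (the in-memory class_list stands in for the file at class_list_path, and the id/name format is an assumption, not the library's exact output):

```python
import torch

def decode_predictions_sketch(preds, top=5, class_list=None):
    """Sketch of ImageNet-style decoding: map the top-k scores of each
    sample in the batch to (class_id, class_name, score) tuples."""
    class_list = class_list or []
    results = []
    for sample in preds:
        scores, indices = torch.topk(sample, k=min(top, sample.numel()))
        results.append([
            (str(int(i)),
             class_list[int(i)] if int(i) < len(class_list) else "unknown",
             float(s))
            for s, i in zip(scores, indices)
        ])
    return results

preds = torch.tensor([[0.1, 2.0, 0.5]])
decoded = decode_predictions_sketch(preds, top=2, class_list=["cat", "dog", "fox"])
```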

class signxai.torch_signxai.NoSoftmaxWrapper(model: Module)[source]

Bases: Module

Wrapper class that removes softmax from a PyTorch model.

__init__(model: Module)[source]

Initialize internal Module state, shared by both nn.Module and ScriptModule.

forward(x: Tensor) → Tensor[source]

Define the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.
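The wrapper pattern used by NoSoftmaxWrapper can be sketched as follows. This is an illustrative sketch handling only the Sequential case, with an assumed class name, not the library's actual class:

```python
import torch
import torch.nn as nn

class NoSoftmaxWrapperSketch(nn.Module):
    """Sketch of the wrapper pattern: hold the model but forward only
    its pre-softmax part, so calls return raw logits. Here we simply
    drop a final nn.Softmax child if the wrapped model is Sequential."""
    def __init__(self, model):
        super().__init__()
        if isinstance(model, nn.Sequential) and isinstance(model[-1], nn.Softmax):
            model = nn.Sequential(*list(model)[:-1])
        self.model = model

    def forward(self, x):
        return self.model(x)   # raw logits, suitable for XAI backprop

wrapped = NoSoftmaxWrapperSketch(nn.Sequential(nn.Linear(4, 3), nn.Softmax(dim=1)))
out = wrapped(torch.ones(1, 4))
```

Unlike remove_softmax, a wrapper leaves the original model object untouched, which is useful when the model is shared elsewhere.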

signxai.torch_signxai.integrated_gradients(*args, **kwargs)[source]

signxai.torch_signxai.grad_cam(*args, **kwargs)[source]
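The signatures above are opaque (*args, **kwargs), so as an illustration of what the integrated_gradients shortcut computes, here is a minimal Integrated Gradients sketch in pure PyTorch (a standard path-integral approximation, not the library's implementation; the toy model is an assumption):

```python
import torch
import torch.nn as nn

def integrated_gradients_sketch(model, x, target_class, steps=32, baseline=None):
    """Sketch of Integrated Gradients: average the input gradients along a
    straight path from a baseline to the input, then scale by (x - baseline)."""
    baseline = torch.zeros_like(x) if baseline is None else baseline
    total = torch.zeros_like(x)
    for alpha in torch.linspace(0.0, 1.0, steps):
        # Interpolated point on the path; needs gradients w.r.t. itself.
        point = (baseline + alpha * (x - baseline)).requires_grad_(True)
        model(point)[0, target_class].backward()
        total += point.grad
    return (x - baseline) * total / steps

model = nn.Sequential(nn.Linear(4, 3))
attributions = integrated_gradients_sketch(model, torch.ones(1, 4), target_class=0)
```

A sanity check on the sketch: for a linear model the attributions sum to f(x) − f(baseline) exactly, which is the completeness property Integrated Gradients is designed to satisfy.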