Dataclasses and post-processing refactor #2098

Merged
Changes from all commits (81 commits)
68c5582  use dataclass for model in- and outputs (djdameln, May 30, 2024)
ddfcd5f  split dataclass in image and video (djdameln, May 31, 2024)
32e038d  use dataclass in torch inferencer (djdameln, Jul 3, 2024)
675dd3f  use dataclass in openvino inferencer (djdameln, Jul 3, 2024)
5779ab7  add post_processor class (djdameln, Jul 22, 2024)
0662558  remove default metrics from CLI (djdameln, Jul 22, 2024)
fddbeb1  export post processing (djdameln, Jul 23, 2024)
e32bd7d  add post processor to patchcore (djdameln, Jul 23, 2024)
90265e8  use named tuple for inference outputs (djdameln, Jul 23, 2024)
e3a9c1d  validate and format inputs of PredictBatch (djdameln, Jul 24, 2024)
89f972c  update torch inference (djdameln, Jul 24, 2024)
08bdae2  remove base inferencer inheritance (djdameln, Jul 24, 2024)
2bc76fc  update openvino inference (djdameln, Jul 24, 2024)
f7c7f9a  fix visualization (djdameln, Jul 24, 2024)
4160ab3  PredictBatch -> Batch (djdameln, Jul 24, 2024)
fd9eb24  post processor as callback (djdameln, Jul 24, 2024)
87facb6  use callback methods to apply post processing (djdameln, Jul 25, 2024)
2269a78  temporary fix for visualization (djdameln, Jul 25, 2024)
9652b9f  add DatasetItem class (djdameln, Jul 25, 2024)
082bbbc  fix pred_score shape and add __len__ (djdameln, Jul 26, 2024)
dbabb20  make batch iterable (djdameln, Jul 26, 2024)
b190cd3  add in place replace method (djdameln, Jul 26, 2024)
ed904eb  use dataset items in inference (djdameln, Jul 26, 2024)
773e54a  dataset_items -> items (djdameln, Jul 26, 2024)
f8d999a  use namedtuple as torch model outputs (djdameln, Jul 31, 2024)
67046dd  merge main (djdameln, Jul 31, 2024)
d00b938  formatting (djdameln, Aug 6, 2024)
9fb4549  Merge branch 'main' into refactor_outputs (djdameln, Aug 7, 2024)
86cf632  Merge branch 'main' into refactor_outputs (djdameln, Aug 15, 2024)
fa3b874  split dataclasses into input/output and image/video (djdameln, Aug 19, 2024)
2761600  merge input and output classes (djdameln, Aug 19, 2024)
c650dfc  use init_subclass for attribute checking (djdameln, Aug 20, 2024)
ced34ca  add descriptor class for validation (djdameln, Aug 20, 2024)
12cd32d  improve error handling (djdameln, Aug 20, 2024)
b447cab  DataClassDescriptor -> FieldDescriptor (djdameln, Aug 20, 2024)
213c2b4  add is_optional method (djdameln, Aug 20, 2024)
fb80feb  add input validation for torch image and batch (djdameln, Aug 21, 2024)
d2337a7  use image and video dataclasses in library (djdameln, Aug 21, 2024)
b53f1f7  add more validation (djdameln, Aug 23, 2024)
5f16147  add validation (djdameln, Aug 26, 2024)
9203318  make postprocessor configurable from engine (djdameln, Aug 27, 2024)
e99d630  fix post processing logic (djdameln, Aug 27, 2024)
631ba97  Merge branch 'main' into refactor_outputs_separate (djdameln, Aug 27, 2024)
b750042  fix data tests (djdameln, Aug 27, 2024)
b37e265  remove detection task type (djdameln, Aug 27, 2024)
86a365d  fix more tests (djdameln, Aug 27, 2024)
fcbb628  use separate normalization stats for image and pixel preds (djdameln, Aug 27, 2024)
0fc3337  add sensitivity parameters to one class pp (djdameln, Aug 27, 2024)
7ec9dd7  fix utils tests (djdameln, Aug 28, 2024)
f5a48cd  fix utils tests (djdameln, Aug 28, 2024)
afaec9b  remove metric serialization test (djdameln, Aug 28, 2024)
e0a70c8  remove normalization and thresholding args (djdameln, Aug 28, 2024)
211d9f8  set default post processor in base model (djdameln, Aug 28, 2024)
eb584eb  remove manual threshold test (djdameln, Aug 28, 2024)
442c37f  fix remaining unit tests (djdameln, Aug 28, 2024)
bd59184  add post_processor to CLI args (djdameln, Aug 28, 2024)
e17eda5  remove old post processing callbacks (djdameln, Aug 28, 2024)
3140e8b  remove comment (djdameln, Aug 28, 2024)
af99bed  remove references to old normalization and thresholding callbacks (djdameln, Aug 28, 2024)
039be2a  remove reshape in openvino inferencer (djdameln, Aug 29, 2024)
381e638  export lightning model directly (djdameln, Aug 29, 2024)
daead5b  make collate accessible from dataset (djdameln, Aug 29, 2024)
987abe5  fix tools integration tests (djdameln, Aug 29, 2024)
a709c6c  add update method to dataclasses (djdameln, Aug 29, 2024)
a37fa3b  allow missing pred_score or anomaly_map in post processor (djdameln, Aug 29, 2024)
14da4fa  fix exportable centercrop conversion (djdameln, Aug 30, 2024)
beb3b97  fix model tests (djdameln, Aug 30, 2024)
a9d07db  test all models (djdameln, Aug 30, 2024)
6bcca36  fix efficient_ad (djdameln, Aug 30, 2024)
014cb59  post processor as model arg (djdameln, Aug 30, 2024)
58df063  disable rkde tests (djdameln, Aug 30, 2024)
25845fb  fix winclip export (djdameln, Aug 30, 2024)
1defdba  add copyright notice (djdameln, Aug 30, 2024)
8d60276  add validation for numpy anomaly map (djdameln, Sep 2, 2024)
0afb6d9  fix getting started notebook (djdameln, Sep 2, 2024)
a26efb9  remove hardcoded path (djdameln, Sep 2, 2024)
e7d9852  update dataset notebooks (djdameln, Sep 2, 2024)
a4bcbfe  update model notebooks (djdameln, Sep 2, 2024)
085c4aa  Merge branch 'feature/design-simplifications' into refactor_outputs (djdameln, Sep 2, 2024)
eff1f97  fix logging notebooks (djdameln, Sep 2, 2024)
40bb4be  fix model notebook (djdameln, Sep 2, 2024)
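
Taken together, the commits above replace dictionary-based model in- and outputs with dataclasses: per-sample items (DatasetItem), iterable batches (Batch), and descriptor-based field validation, plus a dedicated post-processor. A minimal sketch of the item/batch pattern is shown below; the class and field names (ImageItem, ImageBatch, gt_label, and so on) are illustrative assumptions, not the exact API introduced by this PR.

from dataclasses import dataclass, fields

import torch


@dataclass
class ImageItem:
    """Single-sample record (illustrative field names, not the exact anomalib API)."""

    image: torch.Tensor
    gt_label: torch.Tensor | None = None
    pred_score: torch.Tensor | None = None
    anomaly_map: torch.Tensor | None = None


@dataclass
class ImageBatch:
    """Batched counterpart; iterating yields one ImageItem per sample."""

    image: torch.Tensor
    gt_label: torch.Tensor | None = None
    pred_score: torch.Tensor | None = None
    anomaly_map: torch.Tensor | None = None

    def __len__(self) -> int:
        return self.image.shape[0]

    def __iter__(self):
        # Slice every populated field along the batch dimension and rebuild per-sample items.
        for i in range(len(self)):
            yield ImageItem(**{
                f.name: getattr(self, f.name)[i] if getattr(self, f.name) is not None else None
                for f in fields(ImageItem)
            })

With a structure like this, consumers can write "for item in batch: ..." and access attributes such as item.pred_score instead of indexing parallel dictionaries of tensors, which is the main ergonomic change the refactor targets.
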
354 changes: 204 additions & 150 deletions notebooks/000_getting_started/001_getting_started.ipynb

Large diffs are not rendered by default.

364 changes: 43 additions & 321 deletions notebooks/100_datamodules/101_btech.ipynb

Large diffs are not rendered by default.

330 changes: 33 additions & 297 deletions notebooks/100_datamodules/102_mvtec.ipynb

Large diffs are not rendered by default.

512 changes: 36 additions & 476 deletions notebooks/100_datamodules/103_folder.ipynb

Large diffs are not rendered by default.

87 changes: 17 additions & 70 deletions notebooks/100_datamodules/104_tiling.ipynb

Large diffs are not rendered by default.

330 changes: 32 additions & 298 deletions notebooks/200_models/201_fastflow.ipynb

Large diffs are not rendered by default.

1,401 changes: 27 additions & 1,374 deletions notebooks/600_loggers/601_mlflow_logging.ipynb

Large diffs are not rendered by default.

1 change: 0 additions & 1 deletion src/anomalib/__init__.py
@@ -20,5 +20,4 @@ class TaskType(str, Enum):
"""Task type used when generating predictions on the dataset."""

CLASSIFICATION = "classification"
-DETECTION = "detection"
SEGMENTATION = "segmentation"
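
With the detection task type removed, the enum keeps only two members. Reconstructed from the diff above, the resulting definition should look like this:

from enum import Enum


class TaskType(str, Enum):
    """Task type used when generating predictions on the dataset."""

    CLASSIFICATION = "classification"
    SEGMENTATION = "segmentation"
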
20 changes: 9 additions & 11 deletions src/anomalib/callbacks/metrics.py
@@ -4,6 +4,7 @@
# SPDX-License-Identifier: Apache-2.0

import logging
+from dataclasses import asdict
from enum import Enum
from typing import Any

@@ -12,6 +13,7 @@
from lightning.pytorch.utilities.types import STEP_OUTPUT

from anomalib import TaskType
+from anomalib.dataclasses import Batch
[Review comment (Contributor)]: I am not sure if this is the best place to store these objects
from anomalib.metrics import AnomalibMetricCollection, create_metric_collection
from anomalib.models import AnomalyModule

@@ -96,7 +98,6 @@ def setup(
pl_module.pixel_metrics.add_metrics(new_metrics[name])
else:
pl_module.pixel_metrics = create_metric_collection(pixel_metric_names, "pixel_")
-self._set_threshold(pl_module)

def on_validation_epoch_start(
self,
@@ -120,7 +121,7 @@ def on_validation_batch_end(
del trainer, batch, batch_idx, dataloader_idx # Unused arguments.

if outputs is not None:
-self._outputs_to_device(outputs)
+outputs = self._outputs_to_device(outputs)
[Review comment (Contributor)]: This is for a future reference... I hope to get rid of this device related stuff, and leave it to Lightning

self._update_metrics(pl_module.image_metrics, pl_module.pixel_metrics, outputs)

def on_validation_epoch_end(
@@ -130,7 +131,6 @@ def on_validation_epoch_end(
) -> None:
del trainer # Unused argument.

-self._set_threshold(pl_module)
self._log_metrics(pl_module)

def on_test_epoch_start(
@@ -155,7 +155,7 @@ def on_test_batch_end(
del trainer, batch, batch_idx, dataloader_idx # Unused arguments.

if outputs is not None:
-self._outputs_to_device(outputs)
+outputs = self._outputs_to_device(outputs)
self._update_metrics(pl_module.image_metrics, pl_module.pixel_metrics, outputs)

def on_test_epoch_end(
@@ -167,26 +167,24 @@ def on_test_epoch_end(

self._log_metrics(pl_module)

-def _set_threshold(self, pl_module: AnomalyModule) -> None:
-pl_module.image_metrics.set_threshold(pl_module.image_threshold.value.item())
-pl_module.pixel_metrics.set_threshold(pl_module.pixel_threshold.value.item())

def _update_metrics(
self,
image_metric: AnomalibMetricCollection,
pixel_metric: AnomalibMetricCollection,
output: STEP_OUTPUT,
) -> None:
image_metric.to(self.device)
-image_metric.update(output["pred_scores"], output["label"].int())
-if "mask" in output and "anomaly_maps" in output:
+image_metric.update(output.pred_score, output.gt_label.int())
+if output.gt_mask is not None and output.anomaly_map is not None:
pixel_metric.to(self.device)
-pixel_metric.update(torch.squeeze(output["anomaly_maps"]), torch.squeeze(output["mask"].int()))
+pixel_metric.update(torch.squeeze(output.anomaly_map), torch.squeeze(output.gt_mask.int()))
[Review comment (Contributor), suggested change]:
-pixel_metric.update(torch.squeeze(output.anomaly_map), torch.squeeze(output.gt_mask.int()))
+pixel_metric.update(output.anomaly_map.squeeze(), output.gt_mask.squeeze().int())

def _outputs_to_device(self, output: STEP_OUTPUT) -> STEP_OUTPUT | dict[str, Any]:
if isinstance(output, dict):
for key, value in output.items():
output[key] = self._outputs_to_device(value)
+elif isinstance(output, Batch):
+output = output.__class__(**self._outputs_to_device(asdict(output)))
[Review comment (Contributor)]: would it be an idea to add a comment here? It might be difficult to understand for some readers
elif isinstance(output, torch.Tensor):
output = output.to(self.device)
return output
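
For readers puzzled by the Batch branch above (the point raised in the review comment), the idea is: asdict() flattens the dataclass into a dictionary, every tensor in that dictionary is moved to the callback's device, and a fresh instance of the same class is rebuilt from the result. Below is a standalone sketch of the same pattern; ExampleBatch and to_device are hypothetical stand-ins for illustration, not the real anomalib classes.

from dataclasses import asdict, dataclass

import torch


@dataclass
class ExampleBatch:
    """Hypothetical stand-in for anomalib's Batch; the real class has more fields."""

    image: torch.Tensor
    pred_score: torch.Tensor


def to_device(obj, device: torch.device):
    """Recursively move tensors, dicts of tensors, or tensor dataclasses to `device`."""
    if isinstance(obj, dict):
        return {key: to_device(value, device) for key, value in obj.items()}
    if isinstance(obj, ExampleBatch):
        # asdict() flattens the dataclass into a dict; each field is moved to the
        # device, then a new instance of the same class is built from the result.
        return obj.__class__(**to_device(asdict(obj), device))
    if isinstance(obj, torch.Tensor):
        return obj.to(device)
    return obj


batch = ExampleBatch(image=torch.rand(4, 3, 8, 8), pred_score=torch.rand(4))
batch = to_device(batch, torch.device("cpu"))
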
12 changes: 0 additions & 12 deletions src/anomalib/callbacks/normalization/__init__.py

This file was deleted.

29 changes: 0 additions & 29 deletions src/anomalib/callbacks/normalization/base.py

This file was deleted.

128 changes: 0 additions & 128 deletions src/anomalib/callbacks/normalization/min_max_normalization.py

This file was deleted.

78 changes: 0 additions & 78 deletions src/anomalib/callbacks/normalization/utils.py

This file was deleted.
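
With these normalization callbacks deleted, min-max normalization of anomaly scores presumably moves into the new post-processor, which (per the commits above) keeps separate normalization statistics for image-level and pixel-level predictions and exposes sensitivity parameters. A rough sketch of that idea follows; the class name, method names, and the exact formula are assumptions modelled on the removed min-max callback, not the actual post-processor API added by this PR.

import torch


class MinMaxPostProcessorSketch:
    """Illustrative only: min-max normalization with separate image/pixel statistics."""

    def __init__(self, image_threshold: float, pixel_threshold: float) -> None:
        self.image_threshold = image_threshold
        self.pixel_threshold = pixel_threshold
        # Separate running statistics for image-level scores and pixel-level anomaly maps.
        self.image_min = torch.tensor(float("inf"))
        self.image_max = torch.tensor(float("-inf"))
        self.pixel_min = torch.tensor(float("inf"))
        self.pixel_max = torch.tensor(float("-inf"))

    def update(self, pred_score: torch.Tensor, anomaly_map: torch.Tensor) -> None:
        # Called once per validation batch to track the observed score ranges.
        self.image_min = torch.minimum(self.image_min, pred_score.min())
        self.image_max = torch.maximum(self.image_max, pred_score.max())
        self.pixel_min = torch.minimum(self.pixel_min, anomaly_map.min())
        self.pixel_max = torch.maximum(self.pixel_max, anomaly_map.max())

    @staticmethod
    def _normalize(value: torch.Tensor, threshold: float, v_min: torch.Tensor, v_max: torch.Tensor) -> torch.Tensor:
        # Shift by the threshold and scale so that 0.5 maps to the decision boundary.
        return (((value - threshold) / (v_max - v_min)) + 0.5).clamp(0, 1)

    def normalize(self, pred_score: torch.Tensor, anomaly_map: torch.Tensor) -> tuple[torch.Tensor, torch.Tensor]:
        return (
            self._normalize(pred_score, self.image_threshold, self.image_min, self.image_max),
            self._normalize(anomaly_map, self.pixel_threshold, self.pixel_min, self.pixel_max),
        )
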
