
[ENH] Optimize QUANTTransformer by using shape calculation in _fit method to avoid unnecessary computations #2727

Open
wants to merge 1 commit into base: main

Conversation

shinymack
Contributor

Reference Issues/PRs

Fixes #2722

What does this implement/fix? Explain your changes.

Used direct shape calculations for the representation_functions in the _fit method of QUANTTransformer instead of computing the full representations, since only the output shapes are required to create the intervals.
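To illustrate the idea (the exact set of representation functions in QUANTTransformer is not reproduced here; this is a minimal sketch assuming representations such as the identity, first and second differences, and Fourier magnitudes), the output lengths can be derived arithmetically rather than by materialising each representation:

```python
import numpy as np

def representation_lengths(n_timepoints: int) -> list[int]:
    """Output lengths of hypothetical per-series representations,
    computed arithmetically instead of building each array."""
    return [
        n_timepoints,            # identity: length unchanged
        n_timepoints - 1,        # first difference: np.diff drops one point
        n_timepoints - 2,        # second difference: np.diff(..., n=2) drops two
        n_timepoints // 2 + 1,   # Fourier magnitudes: np.fft.rfft output length
    ]

# Sanity check: the arithmetic matches actually computing each representation.
x = np.random.default_rng(0).normal(size=12)
computed = [len(x), len(np.diff(x)), len(np.diff(x, n=2)), len(np.fft.rfft(x))]
assert representation_lengths(12) == computed
```

Since the intervals only depend on these lengths, skipping the actual transforms in _fit avoids work proportional to the size of the training data.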

Does your contribution introduce a new dependency? If yes, which one?

No

Any other comments?

The output correctly matches the example given in the docstring.

PR checklist

For all contributions
  • I've added myself to the list of contributors. Alternatively, you can use the @all-contributors bot to do this for you after the PR has been merged.
  • The PR title starts with either [ENH], [MNT], [DOC], [BUG], [REF], [DEP] or [GOV] indicating whether the PR topic is related to enhancement, maintenance, documentation, bugs, refactoring, deprecation or governance.
For new estimators and functions
  • I've added the estimator/function to the online API documentation.
  • (OPTIONAL) I've added myself as a __maintainer__ at the top of relevant files and want to be contacted regarding its maintenance. Unmaintained files may be removed. This is for the full file, and you should not add yourself if you are just making minor changes or do not want to help maintain its contents.
For developers with write access
  • (OPTIONAL) I've updated aeon's CODEOWNERS to receive notifications about future changes to these files.

@aeon-actions-bot aeon-actions-bot bot added enhancement New feature, improvement request or other non-bug code enhancement transformations Transformations package labels Apr 4, 2025
@aeon-actions-bot
Contributor

Thank you for contributing to aeon

I have added the following labels to this PR based on the title: [ $\color{#FEF1BE}{\textsf{enhancement}}$ ].
I have added the following labels to this PR based on the changes made: [ $\color{#41A8F6}{\textsf{transformations}}$ ]. Feel free to change these if they do not properly represent the PR.

The Checks tab will show the status of our automated tests. You can click on individual test runs in the tab or "Details" in the panel below to see more information if there is a failure.

If our pre-commit code quality check fails, any trivial fixes will automatically be pushed to your PR unless it is a draft.

Don't hesitate to ask questions on the aeon Slack channel if you have any.

PR CI actions

These checkboxes will add labels to enable/disable CI functionality for this PR. This may not take effect immediately, and a new commit may be required to run the new configuration.

  • Run pre-commit checks for all files
  • Run mypy typecheck tests
  • Run all pytest tests and configurations
  • Run all notebook example tests
  • Run numba-disabled codecov tests
  • Stop automatic pre-commit fixes (always disabled for drafts)
  • Disable numba cache loading
  • Push an empty commit to re-run CI checks

Member

@MatthewMiddlehurst MatthewMiddlehurst left a comment


Looks fine, but could you run this on a few datasets and ensure that the output is still the same.

@shinymack
Contributor Author

Looks fine, but could you run this on a few datasets and ensure that the output is still the same.

OK, I'll run this on some other datasets and post the results here.

@shinymack
Contributor Author

Used this script to compare the output of the new implementation against the old one across different datasets and parameter settings.

# QUANTTransformer_old is a local copy of the previous implementation,
# kept alongside the new one for comparison.
from aeon.transformations.collection.interval_based import QUANTTransformer_old, QUANTTransformer
from aeon.testing.data_generation import make_example_3d_numpy
from aeon.datasets import load_italy_power_demand, load_classification

datasets = [
    ("Example", make_example_3d_numpy(n_cases=10, n_channels=1, n_timepoints=12, random_state=0)[0]),
    ("ItalyPowerDemand", load_italy_power_demand()[0]),
    ("ArrowHead", load_classification("ArrowHead")[0]),
    ("BasicMotions", load_classification("BasicMotions")[0]),
    ("Beef", load_classification("Beef")[0])
]

parameter_sets = [
    [2,8],[4,4],[1,2]
]

for dataset_name, X in datasets:
    print(f"\nDataset: {dataset_name}")
    print(f"Input shape: {X.shape}")

    for params in parameter_sets:
        print(f"\nParameters: interval_depth={params[0]}, quantile_divisor={params[1]}")

        q = QUANTTransformer(interval_depth=params[0], quantile_divisor=params[1])
        qo = QUANTTransformer_old(interval_depth=params[0], quantile_divisor=params[1])

        q.fit(X)
        qo.fit(X)

        qt = q.transform(X)
        qot = qo.transform(X)

        print("Old shape:", qot.shape)
        print("New shape:", qt.shape)

        diff = abs(qt - qot)
        if diff.max() == 0 and diff.min() == 0:
            print("Values: Same")
        else:
            print("Values max diff:", diff.max())
            print("Values mean diff:", diff.mean())

Got the same result for all of them:

Dataset: Example
Input shape: (10, 1, 12)

Parameters: interval_depth=2, quantile_divisor=8
Old shape: (10, 19)
New shape: (10, 19)
Values: Same

Parameters: interval_depth=4, quantile_divisor=4
Old shape: (10, 84)
New shape: (10, 84)
Values: Same

Parameters: interval_depth=1, quantile_divisor=2
Old shape: (10, 21)
New shape: (10, 21)
Values: Same

Dataset: ItalyPowerDemand
Input shape: (1096, 1, 24)

Parameters: interval_depth=2, quantile_divisor=8
Old shape: (1096, 32)
New shape: (1096, 32)
Values: Same

Parameters: interval_depth=4, quantile_divisor=4
Old shape: (1096, 164)
New shape: (1096, 164)
Values: Same

Parameters: interval_depth=1, quantile_divisor=2
Old shape: (1096, 42)
New shape: (1096, 42)
Values: Same

Dataset: ArrowHead
Input shape: (211, 1, 251)

Parameters: interval_depth=2, quantile_divisor=8
Old shape: (211, 280)
New shape: (211, 280)
Values: Same

Parameters: interval_depth=4, quantile_divisor=4
Old shape: (211, 1367)
New shape: (211, 1367)
Values: Same

Parameters: interval_depth=1, quantile_divisor=2
Old shape: (211, 439)
New shape: (211, 439)
Values: Same

Dataset: BasicMotions
Input shape: (80, 6, 100)

Parameters: interval_depth=2, quantile_divisor=8
Old shape: (80, 726)
New shape: (80, 726)
Values: Same

Parameters: interval_depth=4, quantile_divisor=4
Old shape: (80, 3438)
New shape: (80, 3438)
Values: Same

Parameters: interval_depth=1, quantile_divisor=2
Old shape: (80, 1050)
New shape: (80, 1050)
Values: Same

Dataset: Beef
Input shape: (60, 1, 470)

Parameters: interval_depth=2, quantile_divisor=8
Old shape: (60, 522)
New shape: (60, 522)
Values: Same

Parameters: interval_depth=4, quantile_divisor=4
Old shape: (60, 2563)
New shape: (60, 2563)
Values: Same

Parameters: interval_depth=1, quantile_divisor=2
Old shape: (60, 822)
New shape: (60, 822)
Values: Same

Member

@MatthewMiddlehurst MatthewMiddlehurst left a comment


Thanks

Successfully merging this pull request may close these issues.

[ENH] Representation functions calculation in fit are necessary in QUANTTransformer?