Fix NaN issue for Learner1D R -> R^n #340


Merged: 1 commit merged into python-adaptive:master on Dec 13, 2021

Conversation

@Davide-sd (Contributor) commented on Dec 13, 2021

Description

If a function is evaluated at a point where it is not defined, NaN is returned.
For a function R -> R^n, this caused `_update_scale` to compute `self._scale[1] = nan`,
which in turn caused the function values at all other points to be treated as NaN, resulting in an
infinite loop, as reported in issue #339.

Using `np.nanmin` and `np.nanmax` in place of `np.min` and `np.max` seems to solve the issue.

Fixes #339
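
To illustrate the failure mode, here is a minimal sketch (made-up sample values, not the actual `_update_scale` code) showing how a single NaN propagates through `np.min`/`np.max` but is ignored by the NaN-aware variants:

```python
import numpy as np

# Hypothetical vector-valued samples; one point lies where f is undefined.
values = np.array([
    [0.0, 1.0],
    [np.nan, np.nan],  # f evaluated at an undefined point
    [2.0, 3.0],
])

# np.min / np.max propagate the NaN into the scale ...
print(np.max(values) - np.min(values))        # nan
# ... while the NaN-aware variants skip the undefined values.
print(np.nanmax(values) - np.nanmin(values))  # 3.0
```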

Checklist

  • Fixed style issues using `pre-commit run --all` (first install using `pip install pre-commit`)
  • `pytest` passed

Type of change

Check relevant option(s).

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • (Code) style fix or documentation update
  • This change requires a documentation update

@basnijholt enabled auto-merge (squash) on December 13, 2021, 13:00
@basnijholt (Member) commented
Thanks a lot for the fix @Davide-sd!

@basnijholt (Member) left a review comment

LGTM!

@Davide-sd (Contributor, Author) commented

Thanks @basnijholt.

I've run all tests twice on my local machine. On the first run everything passed. On the second run, `adaptive/tests/test_runner.py::test_ipyparallel_executor` failed. Not sure why...

@basnijholt disabled auto-merge on December 13, 2021, 13:40
@basnijholt merged commit 8d4263a into python-adaptive:master on Dec 13, 2021