
Target function returns NaN #435


Closed
bonh opened this issue Jun 6, 2023 · 5 comments

Comments

bonh commented Jun 6, 2023

I have a complex target function that involves a nonlinear optimization under the hood. The optimizer might fail. What should the target function return to indicate to adaptive that this region of the parameter space is no good? np.nan?
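
Roughly, the setup looks like the following minimal sketch (not my actual code; the inner objective and the use of scipy.optimize.minimize are just placeholders):

import numpy as np
from scipy.optimize import minimize

def inner_objective(u, xyz):
    # Placeholder for the real (expensive) inner objective.
    return float(np.sum((u - xyz) ** 2))

def target(xyz):
    # One nonlinear optimization per sampled point; if the optimizer
    # does not converge, return NaN to flag the point as bad.
    xyz = np.asarray(xyz, dtype=float)
    res = minimize(inner_objective, x0=np.zeros(3), args=(xyz,), method="Nelder-Mead")
    return float(res.fun) if res.success else np.nan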

basnijholt (Member) commented Jun 6, 2023

You probably need a custom loss function: if a NaN is returned, set that region's loss to 0 so it won't be picked again.

Which learner are you using?

bonh (Author) commented Jun 6, 2023

Okay, sounds good, I'll try that.

I'm using LearnerND for $\mathbb{R}^3 \rightarrow \mathbb{R}$.

basnijholt (Member) commented Jun 6, 2023

OK, then try something like:

import adaptive
import numpy as np

def custom_loss(simplex, values, value_scale):
    # Give zero loss to any simplex that contains a NaN value,
    # so adaptive will not refine that region again.
    if any(np.isnan(v) for v in values):
        return 0.0
    return adaptive.learner.learnerND.default_loss(simplex, values, value_scale)

learner = adaptive.LearnerND(..., loss_per_simplex=custom_loss)

Note that I didn't test this 😅

basnijholt (Member) commented Jun 6, 2023

Maybe you just want to make it less likely that something in that area gets chosen. You can do that with

import adaptive
import numpy as np

def custom_loss(simplex, values, value_scale):
    loss = adaptive.learner.learnerND.default_loss(simplex, values, value_scale)
    # Down-weight (instead of zeroing) simplices that contain a NaN value,
    # so that region is still sampled occasionally, just less often.
    if any(np.isnan(v) for v in values):
        return loss / 10
    return loss

learner = adaptive.LearnerND(..., loss_per_simplex=custom_loss)
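
For completeness, an (untested) sketch of how this could be wired up end to end; the bounds, the goal, and the target function here are placeholders, and adaptive.runner.simple is just one way to drive the learner:

# Hypothetical usage: `target` maps a point in R^3 to a scalar and may return NaN.
learner = adaptive.LearnerND(
    target,
    bounds=[(-1, 1), (-1, 1), (-1, 1)],
    loss_per_simplex=custom_loss,
)

# Run synchronously until a fixed number of points has been evaluated.
adaptive.runner.simple(learner, goal=lambda l: l.npoints >= 500)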

bonh (Author) commented Jun 21, 2023

That seems to work - thanks!

bonh closed this as completed Jun 21, 2023