Adds support for SBPLX optimizer from nlopt #228
Conversation
It seems there are some failing tests in parts of the code that I never touched and that are not at all related to the suggested code changes... How shall I treat them?
For the failures: basically, a spellchecker is run over the text that becomes part of the documentation. To correct this problem, just add the two words that it is flagging.

In terms of adding this: when the optimizers were created, the NLopt package was added for its global optimizers. One can always add more, but we left it like that. If this optimizer really seems worthwhile, I guess it can be added. As to whether it is beneficial to add more from NLopt, that has not been a concern so far.

I will note that Qiskit Machine Learning no longer depends on Qiskit Algorithms (qiskit-community/qiskit-machine-learning#817) and Qiskit Optimization has a similar PR in progress (qiskit-community/qiskit-optimization#654), primarily in response to 4247d06. So while you can add the optimizer here, ML now has its own local copies of the optimizers, with Opt doing the same.

As this PR adds a new feature, it needs a release note. See https://github.com/qiskit-community/qiskit-algorithms/blob/main/CONTRIBUTING.md#release-notes for details on how to do this.

Also, the optimizer needs to be included in the index of the documentation for the optimizers so it can be found in the API reference that gets published. The file is https://github.com/qiskit-community/qiskit-algorithms/blob/main/qiskit_algorithms/optimizers/__init__.py, which you can see published at https://qiskit-community.github.io/qiskit-algorithms/apidocs/qiskit_algorithms.optimizers.html to see how it looks. If you look at the Global Optimizers part, it has text around the dependence on NLopt and how to install it. As the addition here is a local optimizer, my take would be to add a new list (of just this one) after the scikit-quant ones, prefixed by text saying something like: this is dependent on NLopt, and see the Global Optimizers below, which also depend on it, for installation instructions. A sketch of this is shown below.

And this new optimizer needs to be added to the unit tests so the code is checked by CI; you can see how the NLopt optimizers that are currently included are tested. A sketch of such a test follows the index sketch below.
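For the documentation index step, the addition to `qiskit_algorithms/optimizers/__init__.py` might look roughly like this. This is a sketch only; the module path `nlopts.sbplx` and the exact docstring layout are assumptions based on the existing NLopt wrappers, not verified against this PR:

```python
# Sketch of additions to qiskit_algorithms/optimizers/__init__.py.
# The module docstring doubles as the published API index, so the new local
# optimizer would get its own autosummary list after the scikit-quant ones,
# with a note that it depends on NLopt (see the Global Optimizers section
# for installation instructions), e.g.:
#
#     Local Optimizers (NLopt-based)
#     ------------------------------
#
#     .. autosummary::
#        :toctree: ../stubs/
#        :nosignatures:
#
#        SBPLX

from .nlopts.sbplx import SBPLX  # assumed module path, mirroring crs.py, esch.py, etc.

__all__ = [
    # ... existing optimizer names ...
    "SBPLX",
]
```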
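For the unit-test step, a minimal test might look like the following. Again a sketch under assumptions: the `max_evals` constructor argument and the `minimize` signature are assumed to follow the pattern of the existing NLopt wrapper tests, and the `SBPLX` class name comes from this PR:

```python
# Hypothetical unit test for the new optimizer, following the pattern of the
# NLopt optimizer tests already in the repo (helper names and the constructor
# argument are assumptions, not verified against the actual test suite).
import unittest

import numpy as np

from qiskit_algorithms.optimizers import SBPLX


class TestSBPLX(unittest.TestCase):
    def test_sbplx(self):
        """SBPLX should find the minimum of a simple convex function."""
        optimizer = SBPLX(max_evals=10_000)
        result = optimizer.minimize(
            fun=lambda x: float(np.sum((x - 1.0) ** 2)),
            x0=np.zeros(2),
        )
        np.testing.assert_allclose(result.x, np.ones(2), atol=1e-3)


if __name__ == "__main__":
    unittest.main()
```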
Pull Request Test Coverage Report for Build 14974836240 (Details)
💛 - Coveralls
Thanks for your detailed answer. Looking through the links and the qiskit-machine-learning repo, it seems there are multiple places within the Qiskit repos with these optimization algorithms... Is qiskit-algorithms or qiskit-machine-learning actually the intended place for updated versions of the optimizers? So, should I try to add these changes to the other repo as well?
> Quite a few checks are still failing, though it seems they are not related to my changes (at least I successfully checked most things locally).
Please be aware that the code here is tested nightly in CI - you can see the outcomes under Actions, where the only failures are in the monitoring of the main branch, which, as I explained, we now expect to occur, as the code here depends on V1 primitives that no longer exist there.
As far as I can tell the changes suggested below should fix things up.
> Looking through the links and the qiskit-machine-learning repo, it seems there are multiple places within the Qiskit repos with these optimization algorithms... Is qiskit-algorithms or qiskit-machine-learning actually the intended place for updated versions of the optimizers? So, should I try to add these changes to the other repo as well?
If you look at the README in this repo you will see that it states this code is no longer maintained by IBM. In fact, there is no designated maintainer at all, unlike the applications such as Qiskit Machine Learning, which you can read about, if you like, here: https://medium.com/qiskit/a-new-chapter-for-qiskit-algorithms-and-applications-5baff541e826
As algorithms is not officially maintained, and only parts of it are used in ML and Opt, the maintainers active in those projects decided to copy over the dependent pieces so they can maintain them there, independent of qiskit-algorithms. The ML copy has been completed; the Opt one is not yet done. Whether they would want additional optimizer(s) I cannot say. As I said, the choice here was just to have the global ones from NLopt, but it's a choice. If you want to do the same thing there, I would just open an issue and see what the response is before doing anything further - you can refer to this PR if you like so they can see what it entails. Once this passes here it should be relatively simple to mirror the changes elsewhere.
Co-authored-by: Steve Wood <[email protected]>
LGTM, thanks for the contribution and for updating things based on feedback.
Thanks again - I had waited a couple of days before merging this in case any of the other reviewers had any feedback.
Summary
The SBPLX algorithm is an extension of Nelder-Mead and is expected to yield better results.
Thus, it might be beneficial to add an interface to the NLopt implementation of SBPLX.
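For context, this is roughly what the underlying NLopt call looks like when SBPLX is used directly via the `nlopt` Python package. This is illustrative only; the point of this PR is to expose the algorithm behind this package's optimizer interface rather than have users call NLopt directly:

```python
import nlopt
import numpy as np

# Derivative-free objective; NLopt passes `grad`, but SBPLX never uses it.
def objective(x, grad):
    return float(np.sum((x - 1.0) ** 2))

opt = nlopt.opt(nlopt.LN_SBPLX, 2)  # SBPLX (Subplex) in 2 dimensions
opt.set_min_objective(objective)
opt.set_xtol_rel(1e-8)

x_opt = opt.optimize(np.zeros(2))
print(x_opt, opt.last_optimum_value())  # approximately [1. 1.] and ~0.0
```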
Details and comments
It might be beneficial to add other NLopt optimizers as well. Feel free to comment on whether I should do that.
Closes #227