[ML] update inference configuration options for storage and inference #1600
Conversation
specification/ml/_types/inference.ts
Outdated
  text_embedding?: TextEmbeddingInferenceUpdateOptions
}

export class PipelineAggInferenceConfigUpdateContainer {
Pipeline aggregation only allows regression and classification currently.
The changes look good, but there are a few type renames, which might be a problem for some clients.
Please @elastic/es-clients take a look.
@delvedor For the .NET client, the renames are fine as we're in alpha and haven't yet generated the ML APIs.
Thanks for this!
It looks like you've got `<ModelType>Options` classes and `<ModelType>UpdateOptions` classes which mostly match. Instead you should switch these to use the `@overloadOf` behavior & convention, where the classes used for requests are named `<ModelType>Options` and the classes for responses are named `<ModelType>OptionsRead`. Does this construction work for the case we're covering here?
Also going to tag @swallez since this likely will mean breaks in Java due to renaming; wanted your thoughts as well.
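For context, a minimal sketch of what that `@overloadOf` request/response convention could look like in a spec file. The class and field names here are illustrative assumptions, not the exact classes in this PR, and `integer` is aliased locally so the snippet stands alone:

```typescript
// Sketch of the @overloadOf convention: the request-side class keeps the
// plain name, and the response-side class appends "Read" and declares
// itself an overload of the request class. Names are assumptions.
type integer = number

// Request side: options as written when configuring inference.
export class RegressionInferenceOptions {
  results_field?: string
  num_top_feature_importance_values?: integer
}

/**
 * Response side: the same shape as read back from the server.
 * @overloadOf RegressionInferenceOptions
 */
export class RegressionInferenceOptionsRead {
  results_field?: string
  num_top_feature_importance_values?: integer
}
```

The two classes stay structurally in sync; the `Read` suffix and the `@overloadOf` annotation let code generators emit a single logical type per client language.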
specification/ml/_types/inference.ts
Outdated
  text_embedding?: TextEmbeddingInferenceUpdateOptions
}

export class PipelineAggInferenceConfigUpdateContainer {
Let's add a `/** @variants container */` decorator here too.
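A hedged sketch of what a class marked with `/** @variants container */` looks like: exactly one of its optional properties is expected to be set, and the property name selects the variant. The variant names follow the review discussion above, but the field sets are assumptions for illustration:

```typescript
// Sketch of a variants container. Per the review, pipeline aggregations
// currently allow only the regression and classification variants.
// Field sets inside each variant are illustrative assumptions.
type integer = number

class ClassificationInferenceUpdateOptions {
  num_top_classes?: integer
  results_field?: string
}

class RegressionInferenceUpdateOptions {
  results_field?: string
}

/** @variants container */
export class PipelineAggInferenceConfigUpdateContainer {
  regression?: RegressionInferenceUpdateOptions
  classification?: ClassificationInferenceUpdateOptions
}
```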
@swallez @sethmlarson I can adjust the naming back, but it's "non-optimal" IMO as it's not descriptive of what the type means. IDK how ts file location adjusts the generated types, but it would be good to have all the inference objects in the same place (logically).
From what I can tell,
@@ -157,41 +161,14 @@ export class InferenceAggregation extends PipelineAggregationBase {
   inference_config?: InferenceConfigContainer
 }

-export class InferenceConfigContainer {
+/** @variants container */
+class InferenceConfigContainer {
Not a review comment, but this name is unfortunate as it's in the global namespace; it would've been better as `InferenceAggregationInferenceConfigContainer`, which would be a breaking change but allows using `InferenceConfigContainer` within `ml/_types/inference`.
Thanks for this PR, it's very nice to see contributions to the spec from the server dev team!
I'm not overly concerned by the renaming even if it causes a breaking change as there are a lot of data structure refinements that would require some code changes anyway. Also I may be mistaken but I haven't heard yet of Java client users going deep in ML APIs, so the impact should be fairly limited.
So LGTM from me once the other issues reported have been resolved.
Thanks for this!
This updates the inference configuration options for:
- classification
- regression
- nlp tasks

All configurations have a stored type and an update type. It may or may not be the case that the update version contains a smaller number of allowable fields.
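As a hedged illustration of that stored/update split (class and field names are assumptions, not the exact spec contents), the update type simply exposes a subset of the stored type's fields, namely those that may be overridden at inference time:

```typescript
// Illustrative stored vs. update pair: the update type carries fewer
// allowable fields than the stored type. Names are assumptions.
type integer = number

// Stored form: full configuration persisted with the trained model.
export class ClassificationInferenceOptions {
  num_top_classes?: integer
  num_top_feature_importance_values?: integer
  prediction_field_type?: string
  results_field?: string
}

// Update form: only the fields that may be changed per request.
export class ClassificationInferenceUpdateOptions {
  num_top_classes?: integer
  results_field?: string
}
```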