port scalarmath tests #16


Closed
wants to merge 3 commits into from

Conversation

Collaborator

@ev-br ev-br commented Jan 6, 2023

Make minimal modifications to the dtype/scalar type machinery to get the scalar math tests to run. Lots of xfails/test skips for now, but I think this is more or less it for dtypes. This PR is against the gh-12 branch; these two PRs form a single "stack", as it were.

Next up are ufuncs, especially those which are also ndarray methods.

@lezcano lezcano self-requested a review January 10, 2023 16:38
Collaborator

@lezcano lezcano left a comment


A general question. Can you remind me why we need to explicitly have a base member within _ndarray? What use cases does it cover?

return asarray(self._tensor.imag)
except RuntimeError:
zeros = torch.zeros_like(self._tensor)
return ndarray._from_tensor_and_base(zeros, None)
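For context on why the fallback exists at all: torch raises a RuntimeError when .imag is taken on a tensor with a non-complex dtype, whereas numpy returns an array of zeros. A minimal numpy-only sketch of the behavior being emulated (plain numpy; the wrapper's asarray/_from_tensor_and_base are not involved here):

```python
import numpy as np

# numpy: .imag on a real-valued array yields zeros of the same shape
a = np.arange(3.0)
assert np.array_equal(a.imag, np.zeros(3))

# on a complex array, .imag extracts the imaginary parts
c = np.array([1 + 2j, 3 + 4j])
assert np.array_equal(c.imag, [2.0, 4.0])
```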
Collaborator

Why not simply return asarray(zeros)?

Collaborator Author

Agreed, at some point we'll need to rationalize these two forms.
Basically, asarray takes anything array-like in and returns an array; here we explicitly construct the tensor, so my fingers naturally typed this line. The line above, with asarray(self._tensor.imag), should be changed to follow line 94.

Collaborator

I mean, asarray(Tensor) has the same semantics as _from_tensor_and_base(Tensor, None), so we can decide to always prefer the former over the latter for conciseness and consistency.

Collaborator Author

Not always:

In [2]: a = np.zeros(3)

In [3]: np.asarray(a) is a
Out[3]: True
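To spell out the pass-through semantics referenced above, a small numpy-only sketch (plain numpy; the wrapper's own asarray is a separate implementation):

```python
import numpy as np

a = np.zeros(3)

# asarray is a no-op on an existing ndarray: the very same object comes back
assert np.asarray(a) is a

# np.array, by contrast, copies by default and returns a new object
assert np.array(a) is not a

# for a plain list, asarray must construct a new array
b = np.asarray([0.0, 0.0, 0.0])
assert b is not a
```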

Collaborator

Note that I suggested doing so when using torch.Tensors.

Collaborator Author

ev-br commented Jan 11, 2023

A general question. Can you remind me why we need to explicitly have a base member within _ndarray? What use cases does it cover?

In numpy it's, AFAIU, the Python side of the pointer chain that tracks ownership of the array data:

In [1]: import numpy as np

In [2]: a = np.arange(4)

In [3]: a[::2].base is a
Out[3]: True

In [4]: a[::2][1] = 8

In [5]: a
Out[5]: array([0, 1, 8, 3])

So if arr is a view of some other array, arr.base points to it. (Here "view" means a numpy view, not a PyTorch one.)
Since it's public in numpy, why would we not keep it as well?
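The ownership chain described above can be exercised end to end in plain numpy (numpy only; the wrapper's base attribute is meant to mirror this behavior):

```python
import numpy as np

a = np.arange(4)

# a slice is a view: its .base points at the array that owns the data
v = a[::2]
assert v.base is a

# writes through the view are visible in the owning array
v[1] = 8
assert a[2] == 8

# numpy collapses chains of views: a view of a view
# points straight at the ultimate owner
assert a[::2][1:].base is a

# arrays that own their data have base None
assert a.base is None
assert a.copy().base is None
```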

Collaborator Author

ev-br commented Jan 11, 2023

Review comments addressed in 2b95ac2

Collaborator

lezcano commented Jan 12, 2023

About #16 (comment), sure, but this tracking also happens in torch.tensor()._base, so why not just return that one?

Collaborator

@lezcano lezcano left a comment

We should discuss asarray vs _from_tensor_and_base (probably removing the latter) and potentially removing the base attribute from the wrapper. That said, we don't need to block this PR on that.

ev-br added a commit that referenced this pull request Jan 13, 2023
Collaborator Author

ev-br commented Jan 13, 2023

Thanks for the review, Mario!
Merged in f63c6bb

@ev-br ev-br closed this Jan 13, 2023
@ev-br ev-br deleted the scalarmath branch January 13, 2023 06:51