Absolutely! I'm adding it to my TODO list (if someone beats me to it, great).
One thing we should make a decision on is ENH: add testing support utilities (data-apis/array-api-extra#17): AFAIK array-api-extra has grown its own set of xp_assert_* routines. From the SciPy perspective, it'd be great if we didn't have to maintain ours, so we might want to switch to the ones from array-api-extra. I have to admit I'm not sure what their status is, or whether they are drop-in replacements.
Meanwhile, let me quickly comment on why check_0d and check_namespace are there.
First and foremost, if you do not define the SCIPY_ARRAY_API=1 environment variable, most of these new arguments are inactive, and xp_assert_close is almost the same as npt.assert_allclose, only slightly leaning towards being strict where numpy is lenient.
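As a minimal sketch of one place where the two differ (this assumes xp_assert_close checks dtypes by default; the SciPy-internal import is shown commented out):

```python
import numpy as np
from numpy.testing import assert_allclose

# numpy's assert_allclose is lenient about dtypes: this passes
assert_allclose(np.ones(3, dtype=np.float32), np.ones(3, dtype=np.float64))

# xp_assert_close, by contrast, would flag the dtype mismatch (sketch):
# from scipy._lib._array_api import xp_assert_close
# xp_assert_close(np.ones(3, dtype=np.float32), np.ones(3))  # would raise
```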
Assuming the env variable is defined, check_namespace fixes this:
In [6]: from numpy.testing import assert_allclose
In [7]: assert_allclose(np.ones(3), torch.ones(3, dtype=torch.float64))  # passes silently
What if you want to be able to tell a torch tensor from a numpy array with numerically identical values?
In [5]: xp_assert_close(np.ones(3), torch.ones(3, dtype=torch.float64), check_namespace=False)  # passes
In [4]: xp_assert_close(np.ones(3), torch.ones(3, dtype=torch.float64), check_namespace=True)
---------------------------------------------------------------------------
AssertionError Traceback (most recent call last)
... snip
AssertionError: Namespace of actual and desired arrays do not match.
Actual: scipy._lib.array_api_compat.numpy
Desired: scipy._lib.array_api_compat.torch
My opinionated recommendation is to keep the default value and only switch it to False
if really hard pressed.
check_0d is a painful one. I won't try to summarize the long-winded discussions about it right now; the gist is whether you want to be able to tell a numpy scalar from a 0-D array. Again, my opinionated suggestion would be to keep the default value unless really, really hard pressed, but this strongly depends on the subpackage. scipy.stats is IIUC one of those hard cases, where a lot of code historically follows numpy reductions, which return numpy scalars where everything else returns 0-D arrays. So in scipy.stats you can find an alternative import, from scipy._lib._array_api_0d, which basically makes the defaults usable for scipy.stats.
A couple of (unsolicited, opinionated) bits from experience porting parts of the SciPy test suite:
- Just port assert_allclose to xp_assert_close, without defining the SCIPY_ARRAY_API variable. This is mostly mechanical, but not all of it, see below :-). This will generate a large diff, and we'll have to go through the tests twice for the Array API conversion, but all in all I found it easier to do in two steps.
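As a sketch of what the mechanical part of that diff looks like (the xp_assert_close import path is SciPy-internal, so it is shown commented out here):

```python
import numpy as np
from numpy.testing import assert_allclose

actual = np.sqrt(np.array([4.0, 9.0]))
desired = np.array([2.0, 3.0])

# before:
assert_allclose(actual, desired)

# after (same call shape; without SCIPY_ARRAY_API set, this behaves
# almost identically to assert_allclose):
# from scipy._lib._array_api import xp_assert_close
# xp_assert_close(actual, desired)
```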
- Again without activating SCIPY_ARRAY_API, change assert_equal into xp_assert_equal. Again, mostly mechanical.
- Now, the non-mechanical parts. These xp_* assertions are for arrays. Therefore, use a regular assert for python scalars and other non-array objects:
assert arr.shape == (2, 3) # not assert_equal(arr.shape, (2, 3))
assert math.isclose(arr.ndim / 2, 2.5, abs_tol=1e-15)
- Change assert_almost_equal etc. into their imports from scipy._lib._array_api. This is mechanical.
- Finally, replace assert_almost_equal by its explicit xp_assert_close analog. By this time, it's mechanical if tedious.
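A sketch of what that last translation can look like. Per the numpy docs, assert_almost_equal with decimal=k verifies abs(desired - actual) < 1.5 * 10**-k, so an explicit absolute tolerance is the closest analog (the xp_assert_close import is SciPy-internal, shown commented out):

```python
import numpy as np
from numpy.testing import assert_almost_equal

actual = np.array([1.0000004, 2.0])
desired = np.array([1.0, 2.0])

# before: decimal=6 means abs(desired - actual) < 1.5e-6
assert_almost_equal(actual, desired, decimal=6)

# after, spelling the tolerance out explicitly (sketch):
# from scipy._lib._array_api import xp_assert_close
# xp_assert_close(actual, desired, rtol=0, atol=1.5e-6)
```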