Thanks @seberg, more interesting food for thought.
It’s not quite that simple; there are multiple design axes here. We knew this was an important discussion, so @IvanYashchuk started a separate thread on it: "Default dispatching behavior for supporting multiple array types across SciPy, scikit-learn, scikit-image".
Also remember that NEP 37 was written because the "fully implicit, always on" model of `__array_function__` was deemed non-ideal by its original author.
Could you review that other thread? I think it may make more sense to continue there, at least to discuss the desired user-observable behavior.
This is incorrect. I think you are assuming that libraries like CuPy use `asarray` to consume non-native array types. That is not the case: NumPy is the only library that does this (unfortunately). No matter the registration order, a CuPy array will end up with the CuPy backend here, a PyTorch tensor with the PyTorch backend, and so on.
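To make that concrete, here is a minimal sketch of why registration order doesn't matter when each backend only claims its own concrete array types. All names here (`Backend`, `register`, `dispatch`, the array stand-ins) are invented for illustration; this is not the actual uarray or CuPy API.

```python
class CuPyArray:  # stand-in for cupy.ndarray
    pass

class TorchTensor:  # stand-in for torch.Tensor
    pass

class Backend:
    def __init__(self, name, handled_types):
        self.name = name
        self.handled_types = handled_types

    def handles(self, arg):
        # A backend only claims array types it natively owns; it never
        # coerces foreign arrays via asarray.
        return isinstance(arg, self.handled_types)

registry = []  # populated in registration order

def register(backend):
    registry.append(backend)

def dispatch(arg):
    # Walk the registry in order; only the backend that claims this
    # concrete type wins, so order among unrelated backends is irrelevant.
    for backend in registry:
        if backend.handles(arg):
            return backend.name
    raise TypeError(f"no backend for {type(arg).__name__}")

# Register PyTorch first, then CuPy: a CuPy array still reaches the
# CuPy backend, because the PyTorch backend never claims it.
register(Backend("pytorch", (TorchTensor,)))
register(Backend("cupy", (CuPyArray,)))

assert dispatch(CuPyArray()) == "cupy"
assert dispatch(TorchTensor()) == "pytorch"
```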
I believe you can make this work, but based on this sentence I don’t fully understand it yet. Should the duck array inherit from the ABC here to make that work? If so, that makes sense: it basically imposes a class relationship where today we don’t have one. That would indeed be a reasonable thing to do or require.
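For what it's worth, the class relationship wouldn't have to be literal inheritance: Python's `abc` module also allows virtual registration, so a duck-array type we can't edit could still opt in. A sketch, with invented names (`DuckArray` etc.) purely for illustration:

```python
from abc import ABC

class DuckArray(ABC):
    """Hypothetical marker ABC that duck-array types opt into."""

class MyDuckArray(DuckArray):
    # Opts in via ordinary inheritance.
    pass

class ThirdPartyArray:
    # A type we don't control, so we can't add a base class to it...
    pass

# ...but we can register it as a virtual subclass instead.
DuckArray.register(ThirdPartyArray)

# Either way, isinstance-based dispatch now sees the relationship.
assert isinstance(MyDuckArray(), DuckArray)
assert isinstance(ThirdPartyArray(), DuckArray)
```

Both routes give dispatch machinery an explicit class relationship to test against, which is the part that doesn't exist today.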