Hi @skoudoro, yes there is a lot going on in many places at the moment, so it's hard to follow along. I’d say uarray
didn’t make it over the finish line, mostly because of its implementation complexity. For the most recent status, it’s probably best to look at what SciPy and scikit-learn have actually merged and decided regarding support for PyTorch & co. There are a lot of PRs, and there are docs that are hopefully not too far behind:
- Support for the array API standard — SciPy v1.14.0.dev Manual
- 11.1. Array API support (experimental) — scikit-learn 1.4.2 documentation
If you are specifically interested in dealing with compiled code within DiPy rather than Python-level / array API standard code, then:
- SciPy is doing two things:
  - library-specific support when there’s a matching API in PyTorch/CuPy (and any day now, JAX), just calling straight to that. E.g. if you pass a CuPy array or PyTorch tensor to `scipy.special.logit`, it calls `cupyx.scipy.special.logit` or `torch.special.logit` directly.
  - if there’s no matching API, then: convert to `numpy.ndarray` (on CPU, zero-copy), go through the compiled code, then convert back with `xp.asarray` (see the sketch after this list).
- scikit-learn is doing the second thing above as well, I believe, and a scikit-learn-specific plugin framework is in progress (not sure of the exact status).
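To make the two code paths concrete, here is a minimal sketch of what such a dispatching wrapper could look like, assuming the `array_api_compat` helpers (`array_namespace`, `is_cupy_array`, `is_torch_array`) and using `scipy.special.logit` as the stand-in compiled implementation. The wrapper name `xp_logit` is made up for illustration; this is not SciPy's actual internal code:

```python
import numpy as np
from array_api_compat import array_namespace, is_cupy_array, is_torch_array
from scipy.special import logit as _compiled_logit  # compiled, NumPy-only implementation


def xp_logit(x):
    """Sketch of dispatching `logit` based on the input array's library."""
    xp = array_namespace(x)

    # 1) Library-specific support: a matching API exists, so call it directly.
    if is_cupy_array(x):
        import cupyx.scipy.special
        return cupyx.scipy.special.logit(x)
    if is_torch_array(x):
        import torch
        return torch.special.logit(x)

    # 2) No matching API: convert to numpy.ndarray (zero-copy on CPU),
    #    run the compiled code, then convert back into the caller's namespace.
    result = _compiled_logit(np.asarray(x))
    return xp.asarray(result)
```

Called with a CuPy array or a PyTorch tensor this returns an array of the same library via its own `logit`; called with a `numpy.ndarray` (or another CPU array type that converts zero-copy) it goes through the compiled SciPy path and is converted back with `xp.asarray`.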
If you have a more specific question I can probably point to some relevant PR or discussion.