This RFC proposes adding support to the array API specification for repeating each element of an array.
Overview
Based on array comparison data, the API is available in most array libraries. The main exception is PyTorch, which deviates in its naming convention (`repeat_interleave` vs. NumPy et al.'s `repeat`).
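For reference, a minimal sketch of the naming split, using the existing `numpy.repeat` and `torch.repeat_interleave` functions (values are illustrative only):

```python
import numpy as np
import torch

# Same element-wise repetition semantics, different names.
print(np.repeat(np.array([1, 2, 3]), 2))                # [1 1 2 2 3 3]
print(torch.repeat_interleave(torch.tensor([1, 2, 3]), 2))
# tensor([1, 1, 2, 2, 3, 3])
```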
Proposal

- `repeats`: the number of repetitions for each element.
  - If `axis` is not `None`:
    - if `repeats` is an array, `repeats.shape` must broadcast to `x.shape[axis]`;
    - if `repeats` is a sequence of ints, `len(repeats)` must broadcast to `x.shape[axis]`;
    - if `repeats` is an integer, `repeats` must be broadcast to match the size of the specified `axis`.
  - If `axis` is `None`:
    - if `repeats` is an array, `repeats.shape` must broadcast to `prod(x.shape)`;
    - if `repeats` is a sequence of ints, `len(repeats)` must broadcast to `prod(x.shape)`;
    - if `repeats` is an integer, `repeats` must be broadcast to match the size of the flattened array.
- `axis`: specifies the axis along which to repeat values. If `None`, use a flattened input array and return a flat output array.
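For concreteness, here is a minimal sketch of the semantics proposed above, using NumPy's existing `numpy.repeat`, which already behaves this way. The spec signature in the first comment is a hypothetical rendering of this proposal, not final wording:

```python
import numpy as np

# Hypothetical signature implied by the proposal (assumption, not spec text):
#   repeat(x: array, repeats: int | Sequence[int] | array, /, *, axis: int | None = None) -> array

x = np.array([[1, 2], [3, 4]])

# repeats as an integer, axis=None: the input is flattened (row-major)
# and every element is repeated twice; the result is a flat array.
print(np.repeat(x, 2))                  # [1 1 2 2 3 3 4 4]

# repeats as a sequence of ints, axis=0: len(repeats) matches x.shape[0],
# so row 0 appears once and row 1 appears three times.
print(np.repeat(x, [1, 3], axis=0))     # [[1 2] [3 4] [3 4] [3 4]]

# repeats as an array, axis=1: repeats.shape broadcasts to x.shape[1].
print(np.repeat(x, np.array([2, 1]), axis=1))   # [[1 1 2] [3 3 4]]
```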
Questions
- Both PyTorch and JAX support a kwarg for specifying the output size, in order to avoid stream synchronization (PyTorch) and to allow compilation (JAX); see the sketch following these questions. Without such kwarg support, is this API viable? And why is this kwarg needed when other array libraries (e.g., TensorFlow) omit it?
- When flattening the input array, should we flatten in row-major order? (precedent: `nonzero`)
- Is PyTorch okay with adding a `repeat` function in its main namespace, given the divergence in behavior for `torch.Tensor.repeat`, which behaves similarly to `np.tile`?
- Should the specification support `int`, `List`, and `Tuple` for `repeats`, not an array? PyTorch may prefer a list of `int`s (see Unnecessary cuda synchronizations that we should remove in PyTorch, pytorch/pytorch#108968).
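To make the first and third questions concrete, here is a minimal sketch using the existing output-size kwargs, `total_repeat_length` in `jax.numpy.repeat` and `output_size` in `torch.repeat_interleave`; the values are illustrative only:

```python
import jax.numpy as jnp
import numpy as np
import torch

# JAX: a static total_repeat_length makes the output shape known at
# trace time, so the op can be compiled even when repeats is an array.
x = jnp.array([1, 2, 3])
print(jnp.repeat(x, jnp.array([1, 2, 3]), total_repeat_length=6))
# [1 2 2 3 3 3]

# PyTorch: output_size avoids the device-to-host synchronization
# otherwise needed to compute the output shape from a repeats tensor.
t = torch.tensor([1, 2, 3])
print(torch.repeat_interleave(t, torch.tensor([1, 2, 3]), output_size=6))
# tensor([1, 2, 2, 3, 3, 3])

# Divergence noted in the third question: torch.Tensor.repeat tiles
# the whole tensor (like numpy.tile) rather than repeating elements.
print(torch.tensor([1, 2]).repeat(2))   # tensor([1, 2, 1, 2])
print(np.tile(np.array([1, 2]), 2))     # [1 2 1 2]
```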
Related

- Proposals to extend `numpy.repeat` to avoid repeated invocations: ENH: Allow tuple arguments for `numpy.repeat` (numpy/numpy#21435) and ENH: Introduce multiple pair parameters in the 'repeat' function (numpy/numpy#23937).
- `repeat`: Common APIs across array libraries (1 year later) (#187 (comment))