Thank you for your answer,

Yeah, I was unsure if it ever existed in the first place.

Space is less of an issue, and something like `a.data.hex()` would be fine as 
long as its speed were on par with `a.tobytes()`. However, it is 10x slower on 
my machine.

This e-mail is pretty much a final check (after a fair bit of research and 
experimentation) that it cannot be done, so that I can rule this option out 
and concentrate on the alternatives.
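
For completeness, this is the kind of round trip I have been benchmarking. It 
is only a sketch: the dtype and shape have to be carried alongside the data, 
since neither survives the conversion to bytes (the small test array here is 
illustrative; my timings were on much larger arrays):

```python
import base64
import json

import numpy as np

a = np.arange(6, dtype=np.float64).reshape(2, 3)

# Encode: raw bytes -> base64 -> ASCII str, which JSON accepts.
payload = json.dumps({
    "dtype": str(a.dtype),
    "shape": a.shape,
    "data": base64.b64encode(a.tobytes()).decode("ascii"),
})

# Decode: reverse each step.
obj = json.loads(payload)
b = np.frombuffer(base64.b64decode(obj["data"]),
                  dtype=obj["dtype"]).reshape(obj["shape"])
assert np.array_equal(a, b)

# Hex alternative (simpler, but roughly 1.5x the space of base64):
hex_payload = a.tobytes().hex()
c = np.frombuffer(bytes.fromhex(hex_payload),
                  dtype=a.dtype).reshape(a.shape)
assert np.array_equal(a, c)
```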

Regards,
DG

> On 25 Feb 2024, at 20:30, Robert Kern <robert.k...@gmail.com> wrote:
> 
> On Sat, Feb 24, 2024 at 7:17 PM Dom Grigonis <dom.grigo...@gmail.com> wrote:
> Hello,
> 
> I am seeking a bit of help.
> 
> I need a fast way to transfer numpy arrays in json format.
> 
> Thus, converting array to list is not an option and I need something similar 
> to:
> import numpy as np
> a = np.ones(10000000)
> %timeit a.tobytes()  # 17.6 ms
> This is fast and portable. In other words, I am very happy with this, 
> but...
> 
> Json does not support bytes.
> 
> Any method of subsequent conversion from bytes to string is several times 
> slower than the actual serialisation.
> 
> So my question is: Is there any way to serialise directly to string?
> 
> I remember there used to be 2 methods: tobytes and tostring. However, I see 
> that tostring is deprecated and its functionality is equivalent to `tobytes`. 
> Is it completely gone? Or is there a chance I can still find a performant 
> version of converting to and back from `str` type of non-human readable form?
>  
> The old `tostring` was actually the same as `tobytes`. In Python 2, the `str` 
> type was what `bytes` is now: a string of octets. In Python 3, `str` became a 
> string of Unicode characters (what you want) and the `bytes` type was 
> introduced for the string of octets, so `tostring` was merely _renamed_ to 
> `tobytes` to match. `tostring` never returned a string of Unicode characters 
> suitable for inclusion in JSON.
> 
> AFAICT, there is not likely to be a much more efficient way to convert from 
> an array to a reasonable JSONable encoding (e.g. base64). The time you are 
> seeing is the time it takes to encode that amount of data, period. That said, 
> if you want to use a quite inefficient hex encoding, `a.data.hex()` is 
> somewhat faster than the base64 encoding, but it's less than ideal in terms 
> of space usage.
> 
> -- 
> Robert Kern
> _______________________________________________
> NumPy-Discussion mailing list -- numpy-discussion@python.org
> To unsubscribe send an email to numpy-discussion-le...@python.org
> https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
> Member address: dom.grigo...@gmail.com
