
wee_alloc is unmaintained? #1394

Closed
djc opened this issue Aug 30, 2022 · 7 comments · Fixed by #1400
Labels: Unmaintained (Informational / Unmaintained)

Comments

@djc (Contributor) commented Aug 30, 2022

See https://www.reddit.com/r/rust/comments/x1cle0/dont_use_wee_alloc_in_production_code_targeting/ and rustwasm/wee_alloc#107.

@pinkforest (Contributor) commented Aug 30, 2022

884,115 downloads all time, ~2k a day

Hm, looks like it. I'll try one last email to the maintainer and ask what we should do with it.

@fitzgen seems active on GitHub - we could nudge people to switch to something maintained, or would you be happy for people to keep using the crate? cc @pepyakin @DrGoldfire (bitbucket.org/DrGoldfire) @ZackPierce (other crate owners)

If we do nudge people, what should we recommend as alternative(s)? i.e. an actionable fix.

So we can just tell them to use std's default, which is https://github.com/alexcrichton/dlmalloc-rs - and what else, hmm..
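For the record, a minimal sketch of what that actionable fix looks like in a downstream crate, assuming the usual setup copied from wee_alloc's README:

```rust
// Before: the typical wee_alloc setup downstream crates carry (from wee_alloc's README).
#[global_allocator]
static ALLOC: wee_alloc::WeeAlloc = wee_alloc::WeeAlloc::INIT;

// After: delete the two lines above and drop the `wee_alloc` entry from Cargo.toml.
// With no #[global_allocator] declared, wasm32-unknown-unknown falls back to std's
// default allocator, which is backed by dlmalloc-rs.
```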

People seem to be mainly hitting that memory leak - also:
rustwasm/wee_alloc#85
rustwasm/wee_alloc#106

Silver lining: when things go unmaintained, people tend to come up with research projects like lol_alloc:
rustwasm/wee_alloc#107 (comment)

@Craig-Macomber how did you find the other alternative(s) when you did the research for your lol_alloc?
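For reference, a sketch of how lol_alloc would get wired in as a replacement - this goes by my reading of its README, so the type names (AssumeSingleThreaded, FreeListAllocator) should be double-checked against the crate docs before we recommend it anywhere:

```rust
use lol_alloc::{AssumeSingleThreaded, FreeListAllocator};

// SAFETY: wasm32-unknown-unknown modules are single-threaded unless threads are
// explicitly enabled, which is the assumption AssumeSingleThreaded encodes.
#[global_allocator]
static ALLOCATOR: AssumeSingleThreaded<FreeListAllocator> =
    unsafe { AssumeSingleThreaded::new(FreeListAllocator::new()) };
```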

@pinkforest (Contributor)

@djc do you want to do a PR, or I can do it later? Seems we can flag it as unmaintained per rustwasm/wee_alloc#107 (comment).. we just need some alternative(s), and maybe we can mention lol_alloc as an honorable research project.

@djc (Contributor, Author) commented Aug 30, 2022

I'm not involved enough to do a PR, just a drive-by issue because of the Reddit thread.

@pinkforest (Contributor) commented Aug 30, 2022

@MaxGraey @bkolobara @jsoverson @pkedy how do you feel about non-std allocators under wasm32 targets ?

I mean, it just works, right 🤷‍♀️ Just curious about the needs / wants / other things out there, since you've been in the deep end on this - I mean, no_std as the major selling point wouldn't be that big a deal on wasm32 targets?

I'm looking at the biggest consumer of this lib
https://crates.io/crates/parity-util-mem (7 months ago - 1,481,950 downloads, ~3k a day)

@dvdplm might have some ideas on whether a non-std alternative allocator still has use-cases today on wasm32 targets?

PR's up: #1400

@bkolobara

@MaxGraey @bkolobara how do you feel about non-std allocators under wasm32 targets ?

I only used the standard allocator and it was good enough for all my stuff (wasm on the backend).

@pepyakin

I would say that wee_alloc is pretty much unmaintained; it seems I and pretty much every other developer have moved on. I would've recommended that users move to something else, but I don't have a good replacement in mind (besides dlmalloc, which is the default).

I'm looking at the biggest consumer of this lib
https://crates.io/crates/parity-util-mem (7 months ago - 1,481,950 downloads, ~3k a day)

I think this is incidental. Substrate used to compile to wasm32-unknown-unknown, but not anymore since there is smoldot: it relies on the default allocator AFAIK.

@jsoverson

I can echo the other comments. There's value in wee_alloc (et al) but only if it's progressing in step with the community, rust, and WebAssembly. If it's not, people should use the standard allocator.
