

You don't need a car in Singapore. Very good public transport and affordable taxis.
For more brain flipping try looking into hardware description languages (Verilog) or proof assistants (Coq).
Is this a new project that was intentionally started in PHP, or something legacy? Any interesting benchmarks, like minimal wire-to-wire network processing time and where the bottleneck is?
I saw an MD5 checksum implemented in Scratch.
I would like to hear more
Canonical makes it hard not to use snaps, so only those who took extra steps are not using them.
Yeah, not with Firefox, at least not without switching to some third-party repo.
It uses KDE and isn't available to the general public.
If you're open to doing some research, you should be able to stay on Debian and use Nix the package manager: https://hachyderm.io/@fasterthanlime/111099224022408403
It comes with a steep learning curve, almost a learning cliff. But once you're past that, it's good.
That's what I'm using, but it's not a fully featured replacement.
Been using it since forever, no reason to switch. It works. Got a bit upset at them when they killed XUL/Pentadactyl though.
Yeah, something like that. Using it on my laptop already: configuration.nix for the system plus home-manager for user stuff. Will move the desktop soon-ish.
NixOS. I'm going to migrate to NixOS by then.
https://github.com/nix-community/home-manager/issues/3968 - this is the one to monitor and poke; there's also a link to a sample wrapper that might or might not help you.
You need this https://github.com/guibou/nixGL
Permanently Deleted
Don't use it - vote with your feet :)
Not quite. Suppose instead of a single version of serde there are now 46 versions, like in https://crates.io/crates/parquet. To be able to use instances derived in some other crate X you have to use the same version of serde. Now, how should a crate decide which versions of serde to support?
All 46, and all optional? Supporting that would be painful. Just the latest one? crates.io is a cemetery full of dead crates; with this support strategy, any handful of crates picked at random are not going to be serde-compatible with each other.
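To make the incompatibility concrete, a hedged sketch (the crate `x` and the exact versions are made up): cargo treats different major versions as entirely separate crates, so types and trait impls built against one don't satisfy bounds from the other.

```toml
[dependencies]
parquet = "46"
x = "0.1"  # hypothetical crate built against parquet = "45"
# cargo compiles parquet 45 and 46 side by side; a trait impl
# written for parquet-45 types does not apply to parquet-46 types,
# so x's API won't accept values from your parquet 46.
```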
A better solution would be better support for compile-time reflection so serde doesn't have to exist in its current state, but that got delayed (by the big macro conspiracy :)
urxvt. It works well enough and doesn't use much memory.
You can find answers to most questions on Google. And there are always people willing to help on the internet.
Kind of like Flatpaks, but it's done with symlinks and fancy changes to the build systems. I think it fits the developer environment better.
Libs.rs is now closed source
START HERE. This is the top-level project for lib.rs that connects all the repositories together.
Previously: https://lemmyrs.org/post/175672
I originally had the sources and data of the site public, hoping they would be interesting to study, aid in bug reporting, bring contributions, and make the site's algorithms transparent.
Instead, I got knee-jerk reactions about lines of code taken out of context. I got angry dogpiles from the Rust community, including rust-lang org members. I don't need to endure such mudslinging. Therefore, the sources are no longer available.
As of right now the bitcoin crate is not deprecated; instead libs.rs responds with error 502.
Fixing feature unification compilation time issues with cargo-hackerman
If you have a workspace with dependencies you have probably noticed that sometimes cargo seemingly unnecessarily recompiles external dependencies as you switch between different members of your workspace.
This is caused by something called feature unification ([1]). Since features by design should be additive-only, cargo tries to avoid redundant work by building with a superset of all required features. The problem comes when multiple crates in the workspace require external dependencies with different sets of features. When you are working with the workspace as a whole, unified features include all the dependencies' features; when you target a single crate, unified features include only the features of that crate's dependencies.
What's worse, if you are using nix with crate2nix to manage dependencies, you get no feature unification at all, and every dependency with each unique combination of features is considered a separate dependency, so the same crate can be compiled (and linked in) multiple times.
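A minimal sketch of the setup that triggers the recompiles, with made-up member names (`app`, `storage`); each member pulls serde with a different feature set:

```toml
# members/app/Cargo.toml (hypothetical workspace member)
[dependencies]
serde = { version = "1", features = ["derive"] }

# members/storage/Cargo.toml (another member, different feature set)
# [dependencies]
# serde = { version = "1", features = ["derive", "rc"] }
```

Building the whole workspace unifies serde's features to derive + rc; running `cargo build -p app` alone uses just derive, so serde (and everything that depends on it) gets rebuilt every time you switch between the two views.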
Experimenting with better CLI errors
What do you think about this kind of indication for conflicting or otherwise invalid arguments?
With command-line arguments being one-dimensional and lines being valid up to hundreds of kilobytes long, only inline indication seems to work.
Would you change anything?
Fastest Luhn algorithm checksum on this street
One of the digits of your credit card number is not like the rest: its purpose is to check for value correctness, so any website or form knows that if the checksum matches, the value was typed in correctly. Or you got lucky with a typo, because it's not a very good checksum: it was designed to be easy to calculate on a mechanical device and has been sticking around mostly for historical reasons.
To check if a number is valid according to the Luhn checksum, you go from right to left, double every second digit, and add all the digits together; if the result ends with 0 the checksum matches, if it ends with some other value it does not.
For example, let's check the number 1594:

write down the number as a bunch of digits: 1 5 9 4
double every second digit from the right: 2 5 18 4
add all the digits: 2 + 5 + 1 + 8 + 4 = 20
the result ends with 0, so the checksum is valid
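The steps above can be sketched as a straightforward, unoptimized implementation (the fast version the post is about would look quite different):

```rust
/// Check a string of ASCII digits against the Luhn checksum.
/// Naive sketch: walk the digits right to left, double every second
/// one, and sum the decimal digits of the results.
fn luhn_valid(number: &str) -> bool {
    let sum: u32 = number
        .bytes()
        .rev()
        .enumerate()
        .map(|(i, b)| {
            let mut d = u32::from(b - b'0');
            if i % 2 == 1 {
                d *= 2;
                if d > 9 {
                    // Summing the two digits of a doubled value is the
                    // same as subtracting 9 (e.g. 18 -> 1 + 8 = 18 - 9).
                    d -= 9;
                }
            }
            d
        })
        .sum();
    sum % 10 == 0
}

fn main() {
    // The worked example from above: 2 + 5 + (1 + 8) + 4 = 20.
    assert!(luhn_valid("1594"));
    assert!(!luhn_valid("1593"));
}
```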
Three key optimizations help to calculate it fast: