JavaScript is probably the most notable example of that. It originally had no guaranteed iteration order, but browsers implemented it so that iteration order matched insertion order, and that behaviour eventually got standardized because websites had started depending on it.

For general-purpose hash maps in standard libraries, I think you ought to either randomize the iteration order so that it's different every time, or guarantee an iteration order. Leaving it unspecified but predictable in practice is a recipe for falling victim to Hyrum's Law (https://www.hyrumslaw.com/).
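To make the trade-off concrete, here's a minimal Rust sketch (Rust chosen only because it comes up downthread): the standard library's HashMap is randomly seeded per process, so its iteration order varies from run to run, while BTreeMap guarantees sorted-key order.

```rust
use std::collections::{BTreeMap, HashMap};

fn main() {
    let pairs = [("alpha", 1), ("beta", 2), ("gamma", 3), ("delta", 4)];

    // HashMap: iteration order is unspecified, and because the hasher is
    // randomly seeded, it typically differs from one run to the next.
    let hashed: HashMap<_, _> = pairs.iter().cloned().collect();
    println!("{:?}", hashed.keys().collect::<Vec<_>>());

    // BTreeMap: iteration order is guaranteed (sorted by key), so nothing
    // can quietly start depending on an incidental order.
    let ordered: BTreeMap<_, _> = pairs.iter().cloned().collect();
    println!("{:?}", ordered.keys().collect::<Vec<_>>());
}
```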
***

Ah, sorry, I didn't read that closely.

One minor nit (which the Perl press releases also mess up): the randomisation is per-interpreter, not per-process. That's not a pedantic distinction. I've seen a couple of bugs/bad behaviors caused by forking servers forgetting to call srand(3) / re-randomize the hash seed after fork(2) and then relying on more randomness than they actually have. Suddenly (for example) hashing rate limiters or Bloom filters all operate in near-lockstep, which can cause significant issues at high volumes. Forking has also caused randomness-related issues (though not necessarily specifically re: hashing) for Rust[1] and Ruby[2], and probably many other platforms. OpenSSL seems to sidestep[3] the issue by using the PID as part of its salt internally.

1: https://github.com/rust-lang/rust/issues/16799
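A rough sketch of the defence I mean, in Rust for consistency with the example above. `reseed_hash_salt` is a made-up name, and a real implementation would pull fresh bytes from the OS RNG rather than the clock; the point is only the PID-mixing idea, which is the same one OpenSSL uses.

```rust
use std::process;
use std::time::{SystemTime, UNIX_EPOCH};

// Hypothetical post-fork hook: derive a fresh salt in each child instead of
// inheriting the parent's. The clock is used only to keep this sketch
// dependency-free; mixing in the PID keeps sibling workers from ending up
// with identical salts even if their clocks read the same value.
fn reseed_hash_salt() -> u64 {
    let nanos = SystemTime::now()
        .duration_since(UNIX_EPOCH)
        .expect("system clock before 1970")
        .as_nanos() as u64;
    nanos ^ u64::from(process::id()).rotate_left(32)
}

fn main() {
    // In a forking server this would run right after fork(2) returns 0 in
    // the child, e.g. from a pthread_atfork-style handler.
    println!("per-child hash salt: {:#018x}", reseed_hash_salt());
}
```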
***

> The overhead is rarely critical in practice.

Depends; you add two extra pointers for each element, so your int → int hash table balloons in size.
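Rough numbers, assuming 64-bit pointers and 32-bit keys and values (the struct names below are illustrative, not any particular library's layout):

```rust
// Open-addressing-style entry: just the payload.
struct PlainEntry {
    key: i32,
    value: i32,
}

// Entry in an insertion-ordered (linked) hash map: the same payload plus
// the two list pointers that maintain insertion order.
struct LinkedEntry {
    key: i32,
    value: i32,
    prev: *mut LinkedEntry,
    next: *mut LinkedEntry,
}

fn main() {
    // Prints 8 and 24 on a typical 64-bit target: the two extra pointers
    // roughly triple the per-entry footprint of an int → int table, before
    // counting the bucket array itself.
    println!("{}", std::mem::size_of::<PlainEntry>());
    println!("{}", std::mem::size_of::<LinkedEntry>());
}
```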
***

Your ex-boss made you angry, and you left behind code that will make life more difficult for whom? Only your ex-boss, or more people? Who gets punished for what you perceive as one person's mistake?
***

This seems like a case where a little more debugging would have saved time over brute-force bisection. The logging to print component orders had to be done eventually anyway.
***

Kudos on the debugging but also on that commit message. It managed to condense the cause and the fix into a couple of paragraphs.
***

There have been the usual suspects creating drama because that's how they can make themselves important. People who care about getting things done rightfully ignore the crybully mob.
***

Yes, but the model has asked that the community stop using this image. It should be pretty easy to find an image that can serve as a standard and that no one objects to.
***

Lena has publicly stated: “But I retired from modeling a long time ago. It’s time I retired from tech, too.”

But that aside, the rest of your argument is just the fallacy of relative privation.
***

Good, let's look at hard numbers!

Windows 3.1 came out in 1992. One of the highlights in the CPU world in 1992 was the launch of the Intel DX2 (https://en.wikipedia.org/wiki/Intel_DX2). It used an 800 nm process node, ran at up to 66 MHz, had 8 KB of cache, and was usually coupled with either 4 or 8 MB of RAM.

Windows Vista came out in 2007. That's the year Intel released the Core 2 Quad (https://en.wikipedia.org/wiki/Intel_Core_2): a quad-core part, manufactured on a 45 nm process node, running at up to 3 GHz, with 256 KB of L1 cache and 8 MB of L2 cache. In that era, computers often had around 2 GB of RAM.

So we're talking 4x the number of cores, ~45x the clock speed, 256x the RAM, 1024x the cache. Benchmarks comparing the two are extremely difficult to find, because they're from completely different eras of computing, but I think it's pretty safe to say that your 10x is completely insignificant in comparison.
***

It's an image that has been used in imaging since the '70s. It's used because everybody uses it. Its being part of an old Playboy centerfold isn't the relevant bit here.
***

Afaik, Lena herself said she'd like her image to stop being used as a test image, and the IEEE has already retired its use.

Even if you think it's woke, there's good reason to respect the model's wish.
***

She later said she wanted people to stop using the picture. https://finchcompany.com/projects/losing-lena-trailer/ https://www.theguardian.com/technology/2024/mar/31/tech-publ...

> Forsén herself has also suggested that the photo should be retired. In 2019, she said she was “really proud” of the picture and she re-created the shot for Wired magazine, which called her “the patron saint of JPEGs”. But later that year, the documentary Losing Lena spearheaded the latest effort to encourage computer science to move on. “I retired from modelling a long time ago,” Forsén said on its release. “It’s time I retired from tech, too. We can make a simple change today that creates a lasting change for tomorrow. Let’s commit to losing me.”
***

Imo the reasonable thing to do would be to assign higher credibility to her opinion in the Wired article than to her opinion in the activist documentary.
***

What do you mean by that word? Does it mean insincere?

EDIT: Why the downvotes? I'm genuinely curious. I see the word thrown around a lot, but I can't get a grasp on what it means.
***

I assume you were downvoted because downvoters did not believe your question was genuine.

It is a fact that in the past, groups of people have been ostracized, ignored, paid less, acknowledged less, and respected less than today based on their race, gender, sexuality, country, profession, etc. This has been raised as an issue, and for some years (perhaps decades) a counter-movement has been under way: openly promote, respect, and acknowledge people who were previously demoted, disrespected, or unacknowledged. The exaggerated examples of this counter-movement are what get called “woke”.

Imagine that we would like to promote the role of ants in the environment because they were largely ignored in the past, so someone makes a movie where an ant beats a lion by sheer physical strength; that would definitely be “woke”. There are cases where people can disagree about whether something is “woke”: think of a woman who travels back in time to a patriarchal society centuries ago, where women were considered property and part of the background, and yet she acts in an independent, outspoken, audacious way toward the men around her without anyone punishing her. That could be called “woke”, but it depends on one's sense of exaggeration.

Reactions against such exaggerations are called “anti-woke”. A great example IMO of a humorous “anti-woke” statement is the image in the following link, a poster for an imaginary documentary: https://knowyourmeme.com/photos/2440971-netflix
***

Calling this regressing to a dark age might be why some people choose not to care about such things. Her picture is already everywhere anyway, and there's nothing offensive or disrespectful about it.
***

What you're calling "nonsense" is being baseline decent human beings. That's not a headache, but if it were one, it'd be one that's worth the cost.
***

I say "female" when I mean female and "non-male" when I mean non-male. In this instance I mean non-male. Please spare me the fake outrage over precise terminology.
***

That's a funny way to write, "Oh, thanks for pointing that out! I should've double-checked my facts before stating them so confidently."

Regardless, I forgive you.
***

> Someone else in the thread said this is from 2021 but I can’t tell since neither the URL nor the page itself give a date.

The Git commits in the article indicate the date.
***

It also very nicely prevents security issues, since if the hashing algorithm is fixed, it can be exploited for denial of service by coming up with keys that all fall into the same bucket.
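For illustration only, here's how cheap that attack is against an unsalted Java-style `h = h*31 + byte` string hash (the hash below is a deliberately weak stand-in, not any real library's): "Aa" and "BB" hash identically, so any concatenation of those two blocks collides, and an attacker can precompute 2^n colliding keys offline.

```rust
// Deliberately weak, unsalted hash used only to illustrate the attack:
// h = h * 31 + byte, the shape of many classic string hashes.
fn weak_hash(s: &str) -> u64 {
    s.bytes()
        .fold(0u64, |h, b| h.wrapping_mul(31).wrapping_add(u64::from(b)))
}

// "Aa" and "BB" collide under this hash, so every string built from those
// two-character blocks collides with every other such string.
fn colliding_keys(blocks: usize) -> Vec<String> {
    let mut keys = vec![String::new()];
    for _ in 0..blocks {
        keys = keys
            .iter()
            .flat_map(|k| [format!("{k}Aa"), format!("{k}BB")])
            .collect();
    }
    keys
}

fn main() {
    let keys = colliding_keys(10); // 2^10 = 1024 keys
    let first = weak_hash(&keys[0]);
    assert!(keys.iter().all(|k| weak_hash(k) == first));
    println!("{} keys, all hashing to {}", keys.len(), first);
    // Fed to a table that buckets by this hash, every insert and lookup
    // degrades to a scan of one bucket.
}
```

A per-process (or, as noted upthread, per-interpreter) random seed is exactly what takes that offline precomputation away from the attacker.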