In your case it makes sense to get the most performant machine you can, even if that means paying a ton more for marginal gains. This is not usually true for the general public.
What an oddly aggressive and hostile response to such a banal observation. Yes, millions of people use software they hate, all the time, that’s wildly uncontroversial.
I always likened "engineers"[1] to "people who are proficient in calculus", and "computers"[1] to "people who are proficient at calculations". There was a brief sidestep from the late 1980s to the early 2010s (~2012) when the term "software engineer" came into vogue and ran completely orthogonal to "proficiency in calculus". I mean, literally 99% of software engineers never learned calculus!

But it's nice to see that ever since ~2015 or so (and perhaps even going forward) proficiency in calculus is coming to the fore again. We call those "software engineers" "ML engineers" nowadays; ehh, fine by me. And all those "computers" are not people anymore -- it looks like carefully arranged sand (silicon) in metal took over. I wonder if it's just a matter of time before the carefully-arranged-sand-in-metal form factor takes over the "engineer" role too: one of those Tesla/Figure robots becomes better than people at being "proficient in calculus" and "proficient at calculations".

[1]: I took the terms "engineer" and "computer" literally from the movie "Hidden Figures": https://en.wikipedia.org/wiki/Hidden_Figures#Plot It looks like ever since humankind learned calculus there has been an enormous benefit to applying it in the engineering of rockets, aeroplanes, bridges, houses, and eventually "the careful arrangement of sand (silicon)". Literally every one of those jobs required learning calculus at school and applying calculus at work.
Hmm, that is an interesting take. Calculus does seem like the uniting factor. I've come to appreciate the fact that domain knowledge plays a more dominant role in solving a problem than technical/programming knowledge. I often wonder how software could align with other engineering practices and standardize its approach to design, so we could just churn out code without an excessive reliance on quality assurance. I'm really hoping visual programming is going to be the savior here: it might allow SMEs and domain experts to use a visual interface to implement their ideas.

It's interesting how Python dominated C/C++ in the case of the NumPy community. One would have assumed C/C++ to be a more natural fit for performance-oriented code, but the domain knowledge overpowered the technical knowledge, and eventually people started asking funny questions like https://stackoverflow.com/questions/41365723/why-is-my-pytho...
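(A minimal illustration of that last point, not from the thread: the vectorised NumPy call hands the inner loop to optimised compiled code, which is why "Python" can beat a naive hand-rolled loop.)

```python
# Toy comparison: an interpreted Python loop vs. a single vectorised NumPy call.
import time
import numpy as np

a = np.random.rand(10_000_000)

t0 = time.perf_counter()
total_loop = 0.0
for x in a:                      # pure-Python loop over 10M floats
    total_loop += x * x
t1 = time.perf_counter()

total_vec = float(np.dot(a, a))  # one call into an optimised BLAS kernel
t2 = time.perf_counter()

print(f"loop: {t1 - t0:.2f}s   vectorised: {t2 - t1:.4f}s")
```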
That 99% guess seems high considering calculus is generally a required subject when studying computer science (or software engineering) at most universities I know of.
Calculus is continuous, analog math. Digital computers use discrete math. Both are math, and both are still incredibly important. Rockets haven't gone out of style.
> I find Linux having much fewer glitches using multiple screens.

Maybe as long as you don't need working fractional scaling with different DPI monitors, which is nothing fancy now.
Unless he is doing Linux development in the first place, that sounds very weird. You most certainly don't need to set up WSL to lint Python or, say, JS in VSCode on Windows.
> and best OS

I do networking stuff, and macOS is on par with Windows: I can't live on it for longer than a week without running into bugs or very questionable behavior. Same as Windows.
Arch works fairly well on Apple silicon now, though Fedora is easier/recommended.
Emulation is limited due to the 16KB pages, and there's no Thunderbolt display out.
Employers typically also care about costs like “how hard is it to provision the devices” and “how long is the useful life of this” or “can I repurpose an old machine for someone else”.
> "I don't even know what you mean by mouse-select and paste." Presumably they mean linux-style text select & paste, which is done by selecting text and then clicking the middle mouse button to paste it (no explicit "copy" command). macOS doesn't have built-in support for this, but there are some third-party scripts/apps to enable it. For example: https://github.com/lodestone/macpaste |
I really like the keyboards on my Frameworks. I have both the 13 and the new 16, and they are pretty good. Not as good as the old T4*0s I'm afraid, but certainly usable.
Remember, though, that the binaries deployed in production environments are not being built locally on individual developer machines, but rather in the cloud, as reproducible builds securely deployed from the cloud to the cloud. Modern language tooling (Go, Rust et al) allows one to build and test on any architecture, and the native macOS virtualization (https://developer.apple.com/documentation/virtualization) provides remarkably better performance compared to Docker (which is a better explanation for its fading from daily use).

Your "trend" may, in fact, not actually reflect the reality of how cloud development works at scale. And I don't know a single macOS developer that "lean(s) on the team member who is on Linux" to leverage tools that are already present on their local machine. My own development environments are IDENTICAL across all three major platforms.
I don’t know a single engineer who had issues with M chips, and most engineers I know (me included) benefited considerably from the performance gains, so perhaps your niche isn’t that universal?
I've seen MLX folks post on X about nice results running local LLMs: https://github.com/ml-explore/mlx Also, Siri. And consider: you're scaling AI on Apple's hardware too; you can develop your own local custom AI on it; there's more memory available for linear algebra in a maxed-out MBP than in the biggest GPUs you can buy. They scale VRAM capacity with unified memory, and that plus a ton of software is enough to make the Apple stuff plenty competitive with the corresponding NVIDIA stuff for the specific task of running big AI models locally.
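(A hedged sketch of the kind of local-LLM run being described, assuming `pip install mlx-lm` on Apple silicon; the model name is just an example from the mlx-community Hugging Face org, and the mlx-lm API may shift between releases.)

```python
# Load a quantised model into unified memory and generate locally with MLX.
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/Mistral-7B-Instruct-v0.2-4bit")
text = generate(model, tokenizer,
                prompt="Explain unified memory in one sentence.",
                max_tokens=100)
print(text)
```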
I think the US is their main market, though. The rest of the world prefers cheaper, better phones and doesn't mind using WhatsApp for messaging instead of iMessage.
> Normal users will not do this.

Unfortunately, a lot of the "freedom" crowd think that unless you want to be an '80s sysadmin, you don't deserve security or privacy. Or computers.
I, an engineer, am not doing this myself either. There is a middle ground though: just use a privacy-oriented Android build, like DivestOS [1]. There are a couple of caveats:

1. It is still a bit tricky for a non-technical person to install. That should not be a problem if they know somebody who can help, though. There's been some progress making the process more user-friendly recently (e.g. the WebUSB-based GrapheneOS installer).

2. There are some papercuts if you don't install Google services on your phone. microG [2] helps with most, but some still remain.

My main concern with this setup is that I can't use Google Pay this way, but having to bring my card with me every time seems like an acceptable trade-off to me.

[2]: https://microg.org/
WebGPU and many other features on iOS are unimplemented, or implemented in half-assed or downright broken ways. These features work in all the modern desktop browsers and on Android, though!
> Do you have a legal right to write software or run your own software for hardware you bought?

No, obviously not. Do you have a right to run a custom OS on your PS5? Do you have a right to run a custom application on your cable set-top box? Etc. Such a right obviously doesn't exist, and most people generally are somewhere between "don't care" and actively rejecting it for various reasons (hacking in games, content DRM, etc.).

It's fine if you think there should be one, but it continues this weird trend of using Apple as a foil for complaining about random other issues that other vendors tend to be just as bad or oftentimes even worse about, simply because they're a large company with a large group of anti-fans/haters who will readily nod along.

Remember when the complaint was that the pelican case of factory OEM tools you could rent (or buy) to install your factory replacement screen was too big and bulky, meaning it was really just a plot to sabotage right to repair? https://www.theverge.com/2022/5/21/23079058/apple-self-servi...
> Remember when the complaint was that the pelican case of factory OEM tools you could rent (or buy) to install your factory replacement screen was too big and bulky, meaning it was really just a plot to sabotage right to repair?

Yes, I do. That was and continues to be a valid complaint, among all the other anti-repair schemes Apple have come up with over the years: DRM for parts, complete unavailability of some commonly repaired parts, deliberate kneecapping of "Apple authorized service providers", leveraging US customs to seize shipments of legitimate and/or unlabeled replacement parts as "counterfeits", gaslighting by official representatives on Apple's own forums about data recovery, sabotaging right-to-repair laws, and even denial of design issues[1] to weasel out of warranty repairs, just to name a few. All with the simple anti-competitive goal of making third-party repair (both authorized and independent) a less attractive option due to artificially increased prices, longer timelines to repair, or scaremongering about privacy.

https://arstechnica.com/gadgets/2022/12/weakened-right-to-re...

https://www.pcgamer.com/ifixit-says-apples-iphone-14-is-lite...

[1] Butterfly keyboards, display cables that were too short and failed over time
I’m not particularly scared of a 51% attack between the devices attached to my Apple ID. If my iPhone splits inference work with my idle MacBook, Apple TV, and iPad, what’s the problem there?
Google operates in China, albeit via their HK domain. They also had Project Dragonfly, if you remember.

The lesser of two evils is that one company doesn't try to actively profile me (in order for their ads business to be better) with every piece of data it can find and force me to share all possible data with them.

Google is famously known to kill apps that are good and used by customers: https://killedbygoogle.com/

As for the subpar apps: there is a massive difference in network traffic when sitting on the Home Screen between iOS and Android.
> Google operates in China albeit via their HK domain.

The Chinese government has access to the iCloud account of every Chinese Apple user.

> They also had project DragonFly if you remember.

Which never materialized.

> The lesser of two evils is that one company doesn’t try to actively profile me (in order for their ads business to be better) with every piece of data it can find and forces me to share all possible data with them.

Apple does targeted and non-targeted advertising as well. Additionally, your carrier has likely sold all of the data they have on you. Apple was also sued for selling user data to ad networks. Odd for a Privacy First company to engage in things like that.

> Google is famously known to kill apps that are good and used by customers: https://killedbygoogle.com/

Google has been around for 26 years, I believe. According to that link, 60 apps were killed in that timeframe. According to your statement that Google kills an app a month, that would leave you 252 apps short. Furthermore, the numbers would indicate that Google has killed 2.3 apps per year, or 0.192 apps per month.

> As for the subpar apps: there is a massive difference between the network traffic when on the Home Screen between iOS and Android.

Not sure how that has anything to do with app quality, but if network traffic is your concern, there's probably a lot more an Android user can do than an iOS user to control or eliminate the traffic.
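(Making the rate arithmetic explicit, using the figures quoted above:)

```python
# 60 apps killed over ~26 years, per the figures quoted above.
years, killed = 26, 60
print(round(killed / years, 1))         # ≈ 2.3 killed per year
print(round(killed / (years * 12), 3))  # ≈ 0.192 killed per month
print(years * 12 - killed)              # 252 fewer than "one per month" would predict
```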
> It has an aluminum case, 32GB RAM, an AMD Ryzen CPU that benchmarks similar to the M3, and 1TB SSD.

How much does it weigh? Battery life? Screen quality? Keyboard? Speakers?
Agreed. I'd definitely make the same arguments here as I would for an Audi. There's clearly a market, and that means they're not a bad value for a certain type of person.
If people were actually rational that might be true, but they aren't. Apple survives entirely on the fact that they have convinced people they are cool, not because they actually provide good value.
Really? You can find a laptop with the equivalent of Apple Silicon for $300-500? And while I haven't used Windows in ages, I doubt it runs as well with 8 GB as macOS does.
I provide laptops to people from time to time. They expect to get a MacBook even if the company is a Windows shop, and they don't have any real arguments for it.
So the new iPad & M4 was just some weekend project that they shrugged and decided to toss over to their physical retail store locations to see if anyone still bought physical goods eh
Every company is selling one thing or another, and nothing is going to last forever. I really fail to see what, except for generic negativity, your comment adds to anything.
They famously had a standoff with the US gov't over the Secure Enclave. Marketing aside, all indications point to the iOS platform being the most secure mobile option (imo).
> Samsung and Google offer 7 years of OS and security updates. I believe that beats the Apple policy.

On the second part: https://en.wikipedia.org/wiki/IPadOS_version_history

The last iPads to stop getting OS updates (including security, to be consistent with what Samsung and Google are pledging) got 7 and 9 years of updates, respectively (5th gen iPad and 1st gen iPad Pro). The last iPhones to lose support got about 7 years each (iPhone 8 and X). The 6S, SE (1st), and 7 got 9 and 8 years of OS support with security updates. The 5S (released in 2013) last got a security update in early 2023, so also about 9 years; the 6 (2014) ended at the same time, so let's call it 8 years. The 4S, from 2011, got 8 years of OS support. The 5 and 5C got 7 and 6 years of support (the 5C was the 5 in a new case, so it was always going to get a year less in support).

Apple has not, that I've seen at least, ever established a long-term support policy on iPhones and iPads, but the numbers show they're doing at least as well as what Samsung and Google are promising to do, but have not yet done. And they've been doing this for more than a decade now.

EDIT: Reworked the iOS numbers a bit, down to the month (I was looking at years above and rounding, so this is more accurate). iOS support time by device, for devices that cannot use the current iOS 17 (so the XS and above are not counted here), in months:

The average is 72.5 months, just over 6 years. If we knock out the first 2 phones (both have somewhat justifiable short support periods, with massive hardware changes between each and their successor) the average jumps to just shy of 79 months, or about 6.5 years. The 8 and X look like regressions, but their last updates were just 2 months ago (March 21, 2024), so there's still a good chance their support period will increase and exceed the 7-year mark like every model since the 5S. We'll have to see if they get any more updates in November 2024 or later to see if they can hit the 7-year mark.
> Ca 2013 was the release of the Nexus 5, arguably the first really usable android smartphone.

What a strange statement. I was late to the game with a Nexus S in 2010, and it was really usable.
Define usable. Imho, before the Nexus 4 everything was crap, the Nexus 4 was barely enough (4x1.4 GHz), and the Nexus 5 (4x2.2 GHz) plus the software at the time (post-KitKat) was when it was really ready for the mainstream.
> The promise of privacy is one way in which they position themselves, but I would not bet the bank on that being true forever.

Everyone seems to have forgotten about the Celebrity iCloud photo leak.
The entire software stack is non-free and closed-source.
This means you'd be taking Apple at their word on "privacy". Do you trust Apple? I wouldn't, given their track record.
From your link:

> "Apple defines high-quality credits as those from projects that are real, additional, measurable, and quantified, with systems in place to avoid double-counting, and that ensure permanence."

Apple then pledged to buy carbon credits from a company called Verra. In 2023, an investigation found that more than 90% of Verra's carbon credits are a sham. Notably, Apple made their pledge after the results of this investigation were known - so much for their greenwashing.

https://www.theguardian.com/environment/2023/jan/18/revealed...
Mobile can be more efficient. But you're making big tradeoffs. You are very limited in what you can actually run on-device. And ultimately you're also screwing over your user's battery life, etc.
I’ve found it to be pretty terrible compared to CUDA, especially with Huggingface transformers. There’s no technical reason why it has to be terrible there though. Apple should fix that.
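(A minimal sketch of the Hugging Face + Apple-GPU path being discussed, assuming recent torch and transformers releases; the same pipeline code targets CUDA, MPS, or the CPU depending on what's available.)

```python
# Pick the best available accelerator and run a Transformers pipeline on it.
import torch
from transformers import pipeline

if torch.cuda.is_available():
    device = "cuda"
elif torch.backends.mps.is_available():
    device = "mps"   # Metal Performance Shaders backend on Apple silicon
else:
    device = "cpu"

classifier = pipeline("sentiment-analysis", device=device)
print(classifier("MPS support is getting better, but gaps remain."))
```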
It doesn’t require specific hardware; you can train a neural net with pencil and paper if you have enough time. Of course, some pieces of hardware are more efficient than others for this.
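(To make the point concrete: a toy example, not from the thread, that trains a single linear neuron with nothing but basic arithmetic - no GPU, no framework, and small enough to do by hand.)

```python
# Fit y = w*x + b to three points with plain gradient descent on the mean squared error.
data = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0)]   # points on y = 2x + 1
w, b, lr = 0.0, 0.0, 0.1

for _ in range(200):
    grad_w = sum(2 * (w * x + b - y) * x for x, y in data) / len(data)
    grad_b = sum(2 * (w * x + b - y) for x, y in data) / len(data)
    w -= lr * grad_w
    b -= lr * grad_b

print(f"learned w={w:.2f}, b={b:.2f}")  # converges toward w=2, b=1
```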
Except that's an unreasonable scenario for a smartphone. It doesn't prove that the minute the user goes online it won't be egressing data, willingly or not.
For everyone else who doesn't understand what this means, he's saying Apple wants you to be able to run models on their devices, just like you've been doing on nvidia cards for a while.
I hope this means AI-accelerated frameworks get better support on Mx. Unified memory and Metal are a pretty good alternative for local deep learning development.
So for hardware-accelerated training with something like PyTorch, does anyone have a good comparison between Metal and CUDA, both in terms of performance and capabilities?
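(Nobody posted numbers here, but this is a rough, unscientific way to measure the performance side yourself; it assumes a recent PyTorch build, and uses torch.mps.synchronize() / torch.cuda.synchronize() so the kernels are timed rather than just the dispatch.)

```python
# Time a square matmul on each available backend (CPU, MPS, CUDA).
import time
import torch

def sync(device):
    if device == "mps":
        torch.mps.synchronize()
    elif device == "cuda":
        torch.cuda.synchronize()

def time_matmul(device, n=2048, iters=10):
    a = torch.randn(n, n, device=device)
    b = torch.randn(n, n, device=device)
    for _ in range(3):          # warm-up
        a @ b
    sync(device)
    t0 = time.perf_counter()
    for _ in range(iters):
        a @ b
    sync(device)
    return (time.perf_counter() - t0) / iters

devices = ["cpu"]
if torch.backends.mps.is_available():
    devices.append("mps")
if torch.cuda.is_available():
    devices.append("cuda")

for dev in devices:
    print(f"{dev}: {time_matmul(dev) * 1e3:.1f} ms per 2048x2048 matmul")
```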
These don't sit on the edge of the internet, and typically are not called edge devices. An edge device is usually a more powerful device, such as a router or mini server sitting between the LAN and the internet.
Honestly, if they manage this, they have my money.
But to get actually powerful models running, they need to supply the devices with enough RAM - and that's definitely not what Apple like to do.
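(For a sense of scale, a back-of-the-envelope weight-memory estimate; the parameter counts and bit widths are illustrative, not tied to any particular Apple device, and ignore the KV cache and activations.)

```python
# Rough RAM needed just to hold model weights.
def weight_gb(params_billion: float, bits_per_weight: int) -> float:
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9

for params in (7, 70):
    for bits in (16, 4):
        print(f"{params}B params @ {bits}-bit ≈ {weight_gb(params, bits):.1f} GB")
```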
Prob because they are, like, super behind in the cloud space; it's not like they wouldn't like to sell the service. They've ignored photo privacy quite a few times in iCloud.
> Apple has never been privacy-first in practice
> They also make it a LOT harder than Android to execute your own MITM proxies

I would think ease of MITM and privacy are opposing concerns
Yeah, given that they resisted putting RCS in iMessage for so long, I am a bit skeptical about the whole privacy narrative. Especially when Apple's profit is at odds with user privacy.
You know what would really help Apple customers increase their privacy when communicating with non-Apple devices? Having iMessage available to everyone regardless of their mobile OS.
> This is completely coherent with their privacy-first strategy

Is this the same Apple whose devices do not work at all unless you register an Apple account?
In case it is not abundantly clear by now: Apple's AI strategy is to put inference (and, longer term, even learning) on edge devices. This is completely coherent with their privacy-first strategy (which would be at odds with sending data up to the cloud for processing).
Processing data at the edge also makes for the best possible user experience, because it is completely independent of network connectivity and hence has minimal latency.
If (and that's a big if) they keep their APIs open to run any kind of AI workload on their chips, it's a strategy that I personally really, really welcome, as I don't want the AI future to be centralised in the hands of a few powerful cloud providers.