(Comments)

Original link: https://news.ycombinator.com/item?id=40286029

Your comment raises several points, and I'm happy to discuss them further. However, it is important to keep the conversation constructive and to focus on facts rather than assumptions or personal opinions. Let's address a few specific issues:

1. **Privacy:** While Apple positions itself as a privacy-first company, it must be acknowledged that their business model includes collecting and using user data for targeted advertising and services such as iCloud. That said, they do allow users to opt out of data collection and sharing.

2. **Hardware and software integration:** Apple's integrated approach to hardware and software enables a smoother user experience and a more streamlined development process. For example, their M-series chips are designed specifically to work seamlessly with their operating systems, improving performance and energy efficiency.

3. **Model distillation and federated learning:** Many companies, including Apple, are exploring techniques such as model distillation and federated learning to enable AI capabilities on mobile and edge devices. Both strategies aim to reduce the computational complexity of running large AI models directly on these devices, allowing inference to happen locally while minimizing reliance on cloud resources.

4. **Performance comparisons:** When comparing the performance of different technologies or hardware, factors such as power consumption, pricing, compatibility with existing architectures, and overall user experience must be considered. Ideally, performance comparisons should reflect a holistic perspective rather than focusing solely on raw benchmark scores.

Overall, our discussion revolves around the future of AI and its deployment on mobile and edge devices. By examining aspects such as privacy concerns, hardware optimization, software integration, and the role of large language models, we can gain a deeper understanding of how these technologies will shape our future. Remember, the ultimate goal is to foster a productive and insightful exchange of ideas among community members.
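
To make the "model distillation" part of point 3 above concrete, here is a minimal toy sketch (NumPy only, nothing Apple-specific, toy numbers) of the standard distillation objective: a small on-device "student" model is trained to match the softened output distribution of a large "teacher" model, so only the student has to run at inference time.

```python
# Toy sketch of the distillation loss (Hinton-style soft targets), not any
# particular vendor's implementation.
import numpy as np

def softmax(z, T=1.0):
    z = z / T                                   # temperature softens the distribution
    z = z - z.max(axis=-1, keepdims=True)       # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

teacher_logits = np.array([[4.0, 1.0, 0.5]])    # large cloud-trained model
student_logits = np.array([[2.5, 1.2, 0.3]])    # small on-device model
T = 2.0                                         # distillation temperature

p_teacher = softmax(teacher_logits, T)
p_student = softmax(student_logits, T)

# KL(teacher || student), scaled by T^2 as in the usual formulation; in training
# this would be combined with the ordinary cross-entropy on hard labels.
distill_loss = (T * T) * np.sum(p_teacher * (np.log(p_teacher) - np.log(p_student)))
print(distill_loss)
```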


Original text


Together with next-generation ML accelerators in the CPU, the high-performance GPU, and higher-bandwidth unified memory, the Neural Engine makes M4 an outrageously powerful chip for AI.

In case it is not abundantly clear by now: Apple's AI strategy is to put inference (and longer term even learning) on edge devices. This is completely coherent with their privacy-first strategy (which would be at odds with sending data up to the cloud for processing).

Processing data at the edge also makes for the best possible user experience because of the complete independence from network connectivity and hence minimal latency.

If (and that's a big if) they keep their APIs open to run any kind of AI workload on their chips it's a strategy that I personally really really welcome as I don't want the AI future to be centralised in the hands of a few powerful cloud providers.



> In case it is not abundantly clear by now: Apple's AI strategy is to put inference (and longer term even learning) on edge devices. This is completely coherent with their privacy-first strategy (which would be at odds with sending data up to the cloud for processing).

Their primary business goal is to sell hardware. Yes, they’ve diversified into services and being a shopping mall for all, but it is about selling luxury hardware.

The promise of privacy is one way in which they position themselves, but I would not bet the bank on that being true forever.



> but it is about selling luxury hardware.

Somewhat true but things are changing. While there are plenty of “luxury” Apple devices like Vision Pro or fully decked out MacBooks for web browsing, we no longer live in a world where tech is just lifestyle gadgets. People spend hours a day on their phones, and often run their lives and businesses through them. Even with the $1000+/2-3y price tag, it’s simply not that much given how central a role it plays in your life. This is especially true for younger generations who often don't have laptops or desktops at home, and also increasingly in poorer-but-not-poor countries (say, e.g., Eastern Europe). So the iPhone (their best selling product) is far, far, far more a commodity utility than typical luxury consumption like watches, purses, sports cars etc.

Even in the higher end products like the MacBooks you see a lot of professionals (engineers included) who choose it because of its price-performance-value, and who don’t give a shit about luxury. Especially since the M1 launched, where performance and battery life took a giant leap.



> Engineers use MacBook pros because it’s the best built laptop, the best screen, arguably the best OS and most importantly - they’re not the ones paying for them.

I am the one paying for my MacBook Pro, because my company is a self-funded business. I run my entire business on this machine and I love it. I always buy the fastest CPU possible, although I don't max out the RAM and SSD.

Amusingly enough, I talked to someone recently about compilation speeds and that person asked me why I don't compile my software (Clojure and ClojureScript) on "powerful cloud servers". Well, according to Geekbench, which always correlates very well with my compilation speeds, there are very few CPUs out there that can beat my M3 Max, and those aren't easily rentable as bare-metal cloud servers. Any virtual server will be slower.

So please, don't repeat the "MacBooks are for spoiled people who don't have to pay for them" trope. There are people for whom this is simply the best machine for the job at hand.

Incidentally, I checked my financials: a 16" MBP with M3 and 64GB RAM, amortized over 18 months (very short!) comes out to around $150/month. That is not expensive at all for your main development machine that you run your business on!



In your case it makes sense to get the most performant machine you can get even if it means you're paying a ton more for marginal gains. This is not usually true for the general public.


"Engineers" - ironically the term used in the software industry for people who never standardize anything, solve the same problem solved by other "engineers" over and over again (how many libraries do you need for arrays and vectors and guis and buttons and text boxes and binary trees and sorting, yada yada?) while making the same mistakes and learning the hard way each time, also vehemently argue about software being "art" might like OSX, but even that is debatable. Meanwhile actual Engineers (the ones with the license) the people who need CAD and design tools for building bridges and running manufacturing plants stay far away from OSX.


I did EE in college but we mostly just used Windows because the shitty semi-proprietary SPICE simulator we had to use, and stuff like that, only supported Windows. The company that makes your embedded processor might only support Windows (and begrudgingly at that).

I think engineers using software should not be seen as an endorsement. They seem to have an incredible tolerance for bad UI.



>They seem to have an incredible tolerance for bad UI.

Irrelevant.

Firstly, it's a tool, not a social media platform designed to sell ads and farm clicks, it needs to be utilitarian and that's it, like a power drill or a pickup truck, not look pretty since they're not targeting consumers but solving a niche set of engineering problems.

Secondly, the engineers are not the ones paying for that software, so their individual tolerance is irrelevant; their company pays for the tools and for their tolerance of those tools, that being part of the job description and the pay.

Unless you run your own business, you're not gonna turn down lucrative employment because on site they provide BOSCH tools and GM trucks while you personally prefer the UX of Makita and Toyota. If those tools' UX slows down the process and makes the project take longer, it's not my problem; my job is to clock in at 9 and clock out at 5, that's it. It's the company's problem to provide the best possible tools for the job, if they can.



You seem to be suggesting that a chunk of the hundreds of millions of people who use a UI that you don't like, secretly hate it or are forced to tolerate it. Not a position I'd personally want to argue or defend, so I'll leave it at that.


What an oddly aggressive and hostile response to such a banal observation. Yes, millions of people use software they hate, all the time, that’s wildly uncontroversial.


Making up what? Go drop by your nearby shop. My hair stylist constantly complains about the management software they use and the quality of the payment integration. At work I constantly hear complaints about shitty, slow IDEs. At the optician's, the guy has been complaining about the inventory system.

People hate software that they're forced to use. Professionals are better at tolerating crapware, because there's usually sunk cost fallacy involved.



If you look at creative pros such as photographers and Hollywood ‘film’ editors, VFX artists, etc. you will see a lot of Windows and Linux as people are more concerned about getting absolute power at a fair price and don’t care if it is big, ugly. etc.


Oh, I'm sure there are lots of creatives who use OSX, so I don't mean to suggest nobody uses it - I'll admit it was a bit in jest, to poke fun at the stereotype. I'm definitely oldschool - but to me it's a bit cringe to hear "Oh, I'm an engineer.." or "As an engineer.." from people who sit at a coffee shop writing emails or doing the most basic s/w dev work. I truly think silicon valley people would benefit from talking to technical people who are building bridges and manufacturing plants and cars and hardware and chips and all this stuff on r/engineeringporn that everyone takes for granted. I transitioned from s/w to hardcore manufacturing 15 years ago, and it was eye opening, and very humbling.


You seem to have some romanticized notion of engineers and to be deeply offended by someone calling themselves an engineer. Why do you even care if someone sits at a coffee shop writing emails and calls themselves an engineer? You think it somehow dilutes the prestige of the word "engineer"? Makes it less elite or what?


I’d assume a lot of this is because you can’t get the software on MacOS. Not a choice. Who is choosing to use Windows 10/11 where you get tabloid news in the OS by default? Or choosing to hide the button to create local user accounts?


Who is choosing to use macOS, where non-Apple monitors and other 3rd party hardware just stops working after minor updates and then starts working again after another update, without any official statement from Apple that there was a problem and a fix?


I do. Because for all issues it has, it is still much better than whatever Windows has to offer.

> where non-Apple monitors and other 3rd party hardware just stops working after minor updates and then starts working again after another update, without any official statement from Apple that there was a problem and a fix?

At least my WiFi doesn't turn off indefinitely during sleep until I power cycle whole laptop because of a shitty driver.



The big difference is that Microsoft - at least usually - confirms and owns the issues.

With Apple, it's usually just crickets... nothing in the release notes, no official statements, nothing. It's just trial and error for the users to see if a particular update fixed the issue.



So the same software exists on multiple platforms, there are no legacy or hardware compatibility considerations, interoperability considerations, no budget considerations, and the users have a choice in what they use?

I.e. the same functionality exists with no drawbacks and money was no object.

And they chose Windows? Seriously why?



We use the sales metrics and signals available to us.

I don't know what to say except resign to the fact that the world is fundamentally unfair, and you won't ever get to run the A/B experiment that you want. So yes, Windows it is !



I always likened "engineers"[1] to "people who are proficient in calculus"; and "computers"[1] to "people who are proficient at calculations".

There was a brief sidestep from the late 1980s to the early 2010s (~2012) where the term "software engineer" came into vogue and ran completely orthogonal to "proficiency in calculus". I mean, literally 99% of software engineers never learned calculus!

But it's nice to see that ever since ~2015 or so (and perhaps even going forward) proficiency in calculus is rising to the fore. We call those "software engineers" "ML Engineers" nowadays, ehh fine by me. And all those "computers" are not people anymore -- looks like carefully arranged sand (silicon) in metal took over.

I wonder if it's just a matter of time before the carefully-arranged-sand-in-metal form factor will take over the "engineer" role too. One of those Tesla/Figure robots becomes "proficient at calculus" and "proficient at calculations" better than "people".

Reference: [1]: I took the terms "engineer" and "computer" literally out of the movie "Hidden Figures" https://en.wikipedia.org/wiki/Hidden_Figures#Plot

It looks like ever since humankind learned calculus there was an enormous benefit to applying it in the engineering of rockets, aeroplanes, bridges, houses, and eventually "the careful arrangement of sand (silicon)". Literally every one of those jobs required learning calculus at school and applying calculus at work.



Why point out Calculus as opposed to just Math?

Might be just my Eastern Europe background where it was all just "Math" and both equations (that's Algebra I guess) and simpler functions/analysis (Calculus?) are taught in elementary school around age 14 or 15.

Maybe I'm missing/forgetting something - I think I used Calculus more during electrical engineering than for computer/software engineering.



In my central european university we've learned "Real Analysis" that was way more concerned about theorems and proofs rather than "calculating" something - if anything, actually calculating derivatives or integrals was a warmup problem to the meat of the subject.


True, we learnt calculus before college in my home country - but it was just basic stuff. But I learnt a lot more of it including partial derivatives in first year of engineering college.

>I think I used Calculus more during electrical engineering than for computer/software engineering.

I think that was OPs point - most engineering disciplines teach it.



Hmm, that is an interesting take. Calculus does seem like the uniting factor.

I've come to appreciate the fact that domain knowledge has a more dominant role in solving a problem than technical/programming knowledge. I often wonder how s/w could align with other engineering practices in terms of approaching design in a standardized way, so we can just churn out code w/o an excessive reliance on quality assurance. I'm really hoping visual programming is going to be the savior here. It might allow SMEs and domain experts to utilize a visual interface to implement their ideas.

It's interesting how Python dominated C/C++ in the case of the NumPy community. One would have assumed C/C++ to be a more natural fit for performance-oriented code. But the domain knowledge overpowered technical knowledge and eventually people started asking funny questions like

https://stackoverflow.com/questions/41365723/why-is-my-pytho...
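
That kind of "why is my Python faster than C" question usually comes down to the fact that the NumPy call is a thin wrapper over optimized compiled kernels, so the slow path is the hand-rolled loop, not Python itself. A rough, self-contained illustration (timings will vary by machine):

```python
# Sketch: the same reduction done element-by-element in the interpreter vs.
# as one call into NumPy's compiled, vectorized kernel.
import time
import numpy as np

x = np.random.rand(10_000_000)

t0 = time.perf_counter()
s_loop = 0.0
for v in x:              # interpreted, one element at a time
    s_loop += v
t1 = time.perf_counter()

s_vec = x.sum()          # single call into compiled code
t2 = time.perf_counter()

print(f"loop: {t1 - t0:.2f}s   vectorized: {t2 - t1:.4f}s")
```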



That 99% guess seems high considering calculus is generally a required subject when studying computer science (or software engineering) at most universities I know of.


This checks out. I'm a software developer who took math all through high school and my first three years of college. I barely scraped through my calculus exams, but I excelled at combinatorics, probability, matrix math, etc. (as long as it didn't veer into calculus for some reason).

I guess I just enjoy things more when I can count them.



Most software engineering just doesn’t require calculus, though it does benefit from having the understanding of functions and limit behaviors that higher math does. But if you look at a lot of meme dev jobs they’ve transitioned heavily away from the crypto craze of the past 5 years towards “prompt engineering” or the like to exploit LLMs in the same way that the “Uber for X” meme of 2012-2017 exploited surface level knowledge of JS or API integration work. Fundamentally, the tech ecosystem desires low skill employees, LLMs are a new frontier in doing a lot with a little in terms of deep technical knowledge.


For this engineering, I think calculus is not the main proficiency enhancer you claim it to be. Linear Algebra, combinatorics, probability and number theory are more relevant.

Calculus was important during the world wars because it means we could throw shells to the enemy army better, and that was an important issue during that period.

Nowadays, calculus is just a stepping stone to more relevant mathematics.



Calculus is continuous, analog math. Digital Computers use discrete math.

Both are math, and both are still incredibly important. Rockets haven't gone out of style.



I challenge you to take those people who build bridges and have them build full software.

I am not saying whether software is engineering or not.

It is a fact, in terms of cost, that software and bridge building are, most of the time, very different activities with very different goals and cost-benefit ratios.

All those things count when making decisions about the level of standardization.

About standards... there are also lots, widely used, from networking protocols to data transfer formats... with well-known strengths and limitations.



I'd say a lot of engineers (bridges, circuit boards, injection mouldings) are kept far away from OSX (and Linux). Honestly, I'd just love an operating system that doesn't decide it's going to restart itself periodically!


From what I've heard (not an OSX user) Windows is the best operating system for multiple screens; OSX and Linux glitch way more. Most anyone doing 3D sculpture or graphics/art on a professional level will eventually move to working with 2-3 screens, and since there are no exclusively Mac design programs, OSX will be suboptimal.

There's little things too, like some people using gaming peripherals (multi-button MMO mice and left hand controllers, etc.) for editing, which might not be compatible with OSX.

And also, if you're mucking around with two 32 inch 4k monitors and a 16 inch Wacom it might start to feel a little ridiculous trying to save space with a Mac Pro.



Besides Windows having more drivers for USB adapters than Linux*, which is a reflection of the market, I find Linux having much fewer glitches using multiple screens.

Once it works, Linux is more reliable than Windows. And virtual desktops have always worked better on Linux than on Windows. So I disagree with you on that front.

* In my case, this means I had to get an Anker HDMI adapter, instead of any random brand.



>I find Linux having much fewer glitches using multiple screens.

Maybe as long as you don't need working fractional scaling with different DPI monitors, which is nothing fancy now.



Have we come full circle?

When I started doing this "Internet stuff" we were called "webmasters", and the job would actually include what today we call:

- DevOps

- Server/Linux sysadmin

- DB admin

- Full stack (backend and frontend) engineer

And I might have forgotten some things.



1999 indeed! I haven't heard that term since around 1999 when I was hired as a "web engineer" and derisively referred to myself as a "webgineer". I almost asked if I could change my title to "sciencematician".


And they can typically setup their dev environment without a VM, while also getting commercial app support if they need it.

Windows requires a VM, like WSL, for a lot of people, and Linux lacks commercial support. macOS strikes a good balance in the middle that makes it a pretty compelling choice.



I was thinking more about software like the Adobe suite, Microsoft Office, or other closed source software that hasn’t released on Linux. Electron has made things a bit better, but there are still a lot of big gaps for the enterprise, unless the company is specifically choosing software to maintain Linux support for end users.

Sure, Wine exists, but it’s not something I’d want to rely on for a business when there are alternatives like macOS which will offer native support.



Most people don't need the Adobe Suite, and the web version of M$-Office is more than OK for occasional use. Most other enterprise software is web apps too nowadays, so it's much less relevant what OS your machine is running than it was ten years ago...


Excel is essential, and in most businesses that I've worked with, most of the accounting and business side is run on it. I switched to Windows from Linux just because of Excel when WSL came out. If Linux had Excel and Photoshop it would be a no-brainer to choose it, but that will never happen.


Yep, that's pretty much it.

Apple fanboys like to talk about how cool and long-lasting a MacBook Air is, but a 500-buck Chromebook will do just as well while covering pretty much 90% of the use cases. Sure, the top-end power is much lower, but at the same time, considering the base RAM/storage combo Apple gives, it is not that relevant. If you start loading it up, that puts the pricing in an entirely different category, and in my opinion the MacBook Air becomes seriously irrelevant when compared to serious computing devices in the same price range...



There's still a huge market for people who want higher end hardware and to run workloads locally, or put a higher price on privacy. For people who want to keep their data close to their chest, and particularly now with the AI bloom, being able to perform all tasks on device is more valuable than ever.

A Chromebook "does the job" but it's closer to a thin client than a workstation. A lot of the job is done remotely and you may not want that.



You usually don't need either for software development though, and if you do the free or online alternatives are often good enough for the rare occasions you need them. If you are a software developer and you have to spend significant time using Office it means you either are developing extensions for Office or your company management is somewhat lacking and you are forced to handle things you should not (like bureaucracy for instance).


↑ This!

I would love to buy Apple hardware, but not from Apple. I mean: an M2 13 inch notebook with access to swap/extend memory and storage, regular US keyboard layout and proper desktop Linux (Debian, Alpine, Mint, PopOS!, Fedora Cinnamon) or Windows. MacOS and the Apple ecosystem just get in your way when you're just trying to maintain a multi-platform C++/Java/Rust code base.



WSL for normal stuff. My co-worker is on Windows and had to setup WSL to get a linter working with VS Code. It took him a week to get it working the first time, and it breaks periodically, so he needs to do it all over again every few months.


Unless he is doing Linux development in the first place, that sounds very weird. You most certainly don't need to set up WSL to lint Python or say JS in VSCode on Windows.


I'm developing on Windows for Windows, Linux, Android, and web, including C, Go, Java, TSQL and MSSQL management. I do not necessarily need WSL except for C. SSH is built directly into the Windows terminal and is fully scriptable in PS.

WSL is also nice for Bash scripting, but it's not necessary.

It is a check box in the "Add Features" panel. There is nothing to install or setup. Certainly not for linting, unless, again, you're using a Linux tool chain.

But if you are, just check the box. No setup beyond VS Code, bashrc, vimrc, and your tool chain. Same as you would do on Mac.

If anything, all the Mac specific quirks make setting up the Linux tool chains much harder. At least on WSL the entire directory structure matches Linux out of the box. The tool chains just work.

While some of the documentation is in its infancy, the workflow and versatility of cross platform development on Windows, I think, is unmatched.



> Engineers use MacBook pros because it’s the best built laptop, the best screen, arguably the best OS and most importantly - they’re not the ones paying for them.

I know engineers from a FANG that picked MacBook Pros in spite of the specs and only because of the bling/price tag. Then they spent their whole time using them as remote terminals for Linux servers, and they still complained about the things being extremely short on RAM and HD.

One of them even tried to convince their managers to give the Vision Pro a try, even though there were zero use cases for it.

Granted, they drive multiple monitors well with a single USB-C plug, at least with specific combinations of monitors and hubs.

It's high time that the "Apple sells high end gear" shtick is put to rest. Even their macOS treadmill is becoming tiring.



The build quality of Apple laptops is still pretty unmatched in every price category.

Yes, there are 2k+ laptops from Dell/Lenovo that match and exceed a similarly priced MacBook in pure power, but usually lack battery life and/or build quality.



> The build quality of Apple laptops is still pretty unmatched in every price category.

I owned a MacBook Pro with the dreaded butterfly keyboard. It was shit.

How many USB ports does the new MacBook Air have? The old ones had two. And shipped with 8GB of RAM? These are shit-tier specs.

The 2020 MacBook Pros had a nice thing: USB-C charging, and you could charge them from either side. Current models went back to MagSafe, only on one side. The number of USB ports is still very low.

But they are shiny. I guess that counts as quality.



Apple devices also work quite seamlessly together. iPads, for example, work great as a second screen wirelessly with the MBPs. I'd immediately buy a 14 inch iPad just for that, since that is so useful when not at your standard desk. Also copy-paste between devices or headphones just works...

If Apple came up with the idea of using an iPad as an external compute unit, that would be amazing... just double your RAM, compute and screen with it in such a lightweight form factor... should be possible if they wanted to.



I think relatively few corporations are offering Macs to people. It's all bog-standard POS Dells, with locked-down Windows images that often do not even allow you to change the screensaver settings or the background image, in the name of "security." I'd love to be wrong about that.


I don’t think it’s at all unreasonable for an engineer using a device for 8+ hours every day to pay an additional, say, 0.5% of their income (assuming very conservatively $100,000 income after tax, $1,000 extra for a MacBook, 2 year product lifespan) for the best built laptop, best screen, and best OS.


> and best OS

I do networking stuff and macOS is on par with Windows - I can't live on it without running into bugs or very questionable behavior for longer than a week. Same as Windows.



What stuff is weird? I have so far had very good experiences with Apple (although not iOS yet). Almost everything I do on my Linux workstation works on Mac too. Windows though is beyond horrible and different in every way.

> I do networking stuff

Me too, but probably very different stuff. I’m doing p2p stuff over tcp and am affected mostly by sock options, buffer sizes, tcp options etc.
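
For context, "sock options, buffer sizes, tcp options" here means the usual per-socket knobs; a minimal Python sketch of the kind of thing involved (defaults and clamping behavior differ between macOS and Linux, which is exactly where platform quirks tend to show up):

```python
# Sketch: typical per-socket tuning for a long-lived p2p-over-TCP connection.
import socket

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)

# Disable Nagle's algorithm so small messages aren't batched.
sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)

# Ask for larger kernel buffers; the OS may clamp these to its own limits.
sock.setsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF, 1 << 20)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_SNDBUF, 1 << 20)

# Keepalive probes for long-lived peer connections.
sock.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)

# Read back what the kernel actually granted (Linux often reports 2x the request).
print(sock.getsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF))
```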



And the M1 chip on mine really alters productivity. Every time we want to update a library, we need some kind of workaround.

It's great having a chip that is so much different than what our production infrastructure uses.



Not worth it at all. I rarely use battery power, so I'd rather have an intel or AMD chip with more cores and a higher clock speed at the expense of the battery. Oh, and an OS that can actually manage its windows, and customize keyboard settings, and not require an account to use the app store


Arch works fairly well on Apple silicon now, though Fedora is easier/recommended. Limited emulation due to the 16KB pages and no Thunderbolt display out.


no, no, NO and yes.

I actually rejected a job offer when I heard I would be given a MacBook Pro.

Apple, being the most closed company these days, should be avoided as much as you can, not to mention its macOS is useless for Linux developers like me; anything else is better.

Its keyboard is dumb to me (that stupid command/ctrl key difference), and the fact that you can't even mouse-select and paste is enough for me to avoid macOS at all costs.



> its keyboard is dumb to me(that stupid command/ctrl key difference)

Literally best keyboard shortcuts out of all major OSes. I don't know what weird crab hands you need to have to comfortably use shortcuts on Windows/Linux. CMD maps PERFECTLY on my thumb.



> I actually rejected a job offer when heard I will be given a macbook pro.

For what it's worth, I've had a good success rate at politely asking to be given an equivalent laptop I can put linux on, or provide my own device. I've never had to outright reject an offer due to being required to use a Mac. At worst I get "you'll be responsible for making our dev environment work on your setup".



I selected Mac + iOS devices when a job offered a choice, specifically to try out the option, while personally sticking with Windows and Android.

Now the performance of Mx Macs convinced me to switch, and I'll die on the hill of Android for life



Anything that runs Linux, even WSL2, is fine; no macOS is the key. And yes, it costs the employer about half of what the expensive Apple devices cost, devices that can not even be upgraded; their hardware is as closed as their software.


Employers typically also care about costs like “how hard is it to provision the devices” and “how long is the useful life of this” or “can I repurpose an old machine for someone else”.


Provisioning is a place where Windows laptops win hands down, though.

Pretty much everything going wrong with provisioning involves going extra weird on hw (usually for cheap supplier) and/or pushing weird third party "security" crapware.



On Windows these days, you get WSL, which is actual Linux, kernel and all. There are still some differences with a standalone Linux system, but they are far smaller than on macOS, where not only is the kernel completely different, but the userspace also has many rather prominent differences that you will very quickly run afoul of (like different command line switches for the same commands).

Then there's Docker. Running amd64 containers on Apple silicon is slow for obvious reasons. Running arm64 containers is fast, but the actual environment you will be deploying to is almost certainly amd64, so if you're using that locally for dev & test purposes, you can get some surprises in prod. Windows, of course, will happily run amd64 natively.
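
A small sketch of the mismatch described above, just driving the standard Docker CLI from Python (assumes Docker Desktop or an equivalent daemon is installed and running):

```python
# Sketch: pinning the image platform makes the local/prod architecture gap explicit.
import subprocess

# Matches the typical amd64 production target; emulated (slow) on Apple silicon.
subprocess.run(
    ["docker", "run", "--rm", "--platform", "linux/amd64", "alpine", "uname", "-m"],
    check=True,
)

# Native (fast) on Apple silicon, but not what an amd64 fleet actually runs.
subprocess.run(
    ["docker", "run", "--rm", "--platform", "linux/arm64", "alpine", "uname", "-m"],
    check=True,
)
```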



> "userspace also has many rather prominent differences ... (like different command line switches for the same commands)."

Very quickly remedied by installing the GNU versions of those commands, ie: "brew install coreutils findutils" (etc)

Then you'll have exactly the same command line switches as on Linux.



> the actual environment you will be deploying to is almost certainly amd64

that’s up to your team of course, but graviton is generally cheaper than x86 instances nowadays and afaik the same is true on google and the other clouds.



Arm is an ISA, not a family of processors. You may expect Apple chips and Graviton to be wildly different, and to perform completely differently in the same scenario. In fact, most Arm CPUs also have specific extensions that are not found in other manufacturers' chips. So yes, while both recognize a base set of instructions, that's about it - expect that everything else is different. I know, amd64 is also technically an ISA, but you have 2 major manufacturers with very similar and predictable performance characteristics. And even then, sometimes something on AMD behaves quite differently from Intel.

For most devs, doing crud stuff or writing high-level scripting languages, this isn't really a problem. For some devs, working on time-sensitive problems or with strict baseline performance requirements, this is important. For devs developing device drivers, emulation can only get you so far.



> "I don't even know what you mean by mouse-select and paste."

Presumably they mean linux-style text select & paste, which is done by selecting text and then clicking the middle mouse button to paste it (no explicit "copy" command).

macOS doesn't have built-in support for this, but there are some third-party scripts/apps to enable it.

For example: https://github.com/lodestone/macpaste



I have one and hate it with a passion. A MacBook Air bought new in the past 3 years should be able to use Teams (alone) without keeling over. Takes over a minute to launch Outlook.

My 15 year old Sony laptop can do better.

Even if Microsoft on Mac is an unmitigated dumpster fire, this is ridiculous.

I avoid using it whenever possible. If people email me, it’d better not be urgent.



Meanwhile here I am, running linux distros and XFCE on everything. My hardware could be a decade old and I probably wouldn't notice.

(In fact I DO have a spare 13 year old laptop hanging around that still gets used for web browsing, mail and stuff. It is not slow.)



Indeed, I have a 15-year-old desktop computer that is still running great on Linux. I upgraded the RAM to the maximum supported by the motherboard, which is 8 GB, and it has gone through three hard drives in its life, but otherwise it is pretty much the same. As a basic web browsing computer, and for light games, it is fantastic.


It also performs pretty well for the particular brand of web development I do, which basically boils down to running VS Code, a browser, and a lot of ssh.

It's fascinating to me how people are still attached to the hardware upgrade cycle as an idea that matters, and yet for a huge chunk of people and scenarios, basically an SSD, 8gb of RAM and an Intel i5 from a decade ago could have been the end of computing history with no real loss to productivity.

I honestly look at people who use Apple or Windows with a bit of pity, because those ecosystems would just give me more stuff to worry about.



Is it an Apple silicon or Intel machine? Intel macs are crazy slow - especially since the most recent few versions of macOS. And especially since developers everywhere have upgraded to an M1 or better.


Yeah it's such a shame how much the performance has been affected by recent macOS. I kept my 2019 MacBook Pro on Catalina for years because everyone else was complaining... finally upgraded directly to Sonoma and the difference in speed was night and day!


Sounds a bit like my Intel MBP, in particular after they (the company I work for) installed all the lovely bloatware/tracking crap IT thinks we need to be subjected to. Most of the day the machine runs with the fans blasting away.

Still doesn't take a minute to launch Outlook, but I understand your pain.

I keep hoping it will die, because it would be replaced with an M-series MBP and they are way, way, WAY faster than even the best Intel MBP.



> Even if Microsoft on Mac is an unmitigated dumpster fire, this is ridiculous.

It is Microsoft. I could rant all day about the dumpster fire that is the "NEW Microsoft Teams (Work or School)"

It's like the perfect shining example of how MS doesn't give a flaming fuck about their end users.



M* has caused nothing but trouble for most mac user engineers I know (read: most engineers I know) who upgraded. Now not only are they building software for a different OS, they're building for a different architecture! They do all of their important compute in Docker, wasting CPU cycles and memory on the VM. All for what: a nice case? nice UI (that pesters you to try Safari)?

It looks like Apple's silicon and software is really good for those doing audio/video. Why people like it for dev is mostly a mystery to me. Though I know a few people who don't really like it but are just intimidated by Linux or just can't handle the small UX differences.



I'm an engineer that has both an apple silicon laptop (mbp, m2) and a linux laptop (arch, thinkpad x1 yoga.) I choose the mac every day of the week and it's not even close. I'm sure it's not great for specific engineering disciplines, but for me (web, rails, sre) it really can't be beat.

The UX differences are absolutely massive. Even after daily-driving that thinkpad for months, Gnome always felt kinda not quite finished. Maybe KDE is better, but it didn't have Wayland support when I was setting that machine up, which made it a non-starter.

The real killer though is battery life. I can work literally all day unplugged on the mbp and finish up with 40-50% remaining. When i'm traveling these days, i don't even bring a power cable with me during the day. The thinkpad, despite my best efforts with powertop, the most aggressive frequency scaling i could get, and a bunch of other little tricks, lasts 2 hours.

There are niceties about Linux too. Package management is better and the docker experience is _way_ better. Overall though, i'd take the apple silicon macbook 10 times out of 10.



Battery life followed by heat and fan noise have been my sticking points with non-mac laptops.

My first gen ThinkPad Nano X1 would be an excellent laptop, if it weren’t for the terrible battery life even in power save mode (which as an aside, slows it down a lot) and its need to spin up a fan to do something as trivial as driving a rather pedestrian 2560x1440 60hz display.

It feels almost like priorities are totally upside down for x86 laptop manufacturers. I totally understand and appreciate that there are performance oriented laptops that aren’t supposed to be good with battery life, but there’s no good reason for there being so few ultraportable and midrange x86 laptops that have good battery life and won’t fry your lap or sound like a jet taking off when pushed a little. It’s an endless sea of mediocrity.



> The thinkpad, […], lasts 2 hours.

This echoes my experiences for anything that needs power management. Not just that the battery life is worse, but that it degrades quickly. In two years it’s barely usable. I’ve seen this with non-Apple phones and laptops. iPhone otoh is so good these days you don’t need to upgrade until EOL of ~6 years (and even if you need it battery is not more expensive than any other proprietary battery). My last MacBook from 2011 failed a couple of years ago only because of a Radeon GPU inside with a known hw error.

> There are niceties about Linux too.

Yes! If you haven’t tried in years, the Linux desktop experience is awesome (at least close enough) for me – a dev who CAN configure stuff if I need to but find it excruciatingly menial if it isn't related to my core work. It’s really an improvement from a decade ago.



I'd like to offer a counterpoint, I have an old'ish T480s which runs linuxmint, several lxd containers for traefik, golang, python, postgres and sqlserver (so not even dockerized, but full VMs running these services), and I can go the whole morning (~4-5 hours).

I think the culprit is more likely the power hungry intel CPU in your yoga?

Going on a slight tangent; I've tried but do not like the Mac keyboards, they feel very shallow to me, hence why I'm still using my old T480s. The newer ThinkPad laptop keyboards all seem to be going that way though (going thinner), much to my dismay. Perhaps a P14s is my next purchase, despite its bulk.

Anybody with a framework 13 want to comment on their keyboard?



I really like the keyboards on my frameworks. I have both the 13 and the new 16, and they are pretty good. Not as good as the old T4*0s I'm afraid, but certainly usable.


Interesting. I do similar (lots of Rails) but have pretty much the opposite experience (other than battery life - Mac definitely wins there). Though I use i3/Sway more than Gnome. The performance of running our huge monolith locally is much better for Linux users than Mac users where I work.

I used a Mac for awhile back in 2015 but it never really stood out to me UX-wise, even compared to Gnome. All I really need to do is open a few windows and then switch between them. In i3 or Sway, opening and switching between windows is very fast and I never have to drag stuff around.



Interestingly enough, the trend I am seeing is all the MacBook engineers moving back to native development environments. Basically, no longer using docker. And just as expected, developers are getting bad with docker and are finding it harder to use. They are getting more and more reliant on devops help, or on the team member who is on Linux to handle all of that stuff.

We were on a really great path for a while there in development where we were getting closer to the ideal of having development more closely resemble production, and to have developers understand the operations tools. Now we're cruising firmly in the opposite direction because of this Apple switch to arm.

Mainly it wouldn't bother me so much if people would recognize that they are rationalizing because they like the computers, but they don't. They just try to defend logically a decision they made emotionally. I do it too, every human does, but a little recognition would be nice.


It's not even a problem with MacBooks as such. They are still excellent consumer devices (non-casual gaming aside). It's this weird positioning of them as the ultimate dev laptop that causes so many problems, IMO.


Remember, though, that the binaries deployed in production environments are not being built locally on individual developer machines, but rather in the cloud, as reproducible builds securely deployed from the cloud to the cloud.

Modern language tooling (Go, Rust et al) allows one to build and test on any architecture, and the native macOS virtualization (https://developer.apple.com/documentation/virtualization) provides remarkably better performance compared to Docker (which is a better explanation for its fading from daily use).

Your "trend" may, in fact, not actually reflect the reality of how cloud development works at scale.

And I don't know a single macOS developer that "lean(s) on the team member who is on Linux" to leverage tools that are already present on their local machine. My own development environments are IDENTICAL across all three major platforms.



Virtualization and Docker are orthogonal technologies. The reason you use Docker, especially in dev, is to have the exact same system libraries, dependencies, and settings on each build. The reason you use virtualization is to access hardware and kernel features that are not present on your hardware or native OS.

If you deploy on docker (or Kubernetes) on Linux in production, then ideally you should be using docker on your local system as well. Which, for Windows or MacOS users, requires a Linux VM as well.



It seems that you're trying to "educate" me on how containers and virtualization work, when in fact I've been doing this for a while, on macOS, Linux and Windows (itself having its own Hyper-V pitfalls).

I know you mean well, though.

There is no Docker on macOS without a hypervisor layer - period - and a VM, though there are multiple possible container runtimes not named Docker that are suitable for devops-y local development deployments (which will always, of course, be constrained in comparison to the scale of lab / staging / production environments). Some of these can better leverage the Rosetta 2 translation layer that Apple provides, than others.



In my experience as a backend services Go developer (and a bit of Scala) the switch to arm has been mostly seamless. There was a little config at the beginning to pull dual-image docker images (x64 and arm) but that was a one time configuration. Otherwise I'm still targeting Linux/x64 with Go builds and Scala runs on the JVM so it's supported everywhere anyway; they both worked out of the box.
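For anyone curious what that "one time configuration" tends to look like, the usual approach is a multi-platform buildx build. A sketch (the image tag and registry are hypothetical, and it assumes a buildx builder with multi-arch support is configured), again just shelling out to the Docker CLI:

```python
# Sketch: building one image manifest that covers both x64 and arm runners.
import subprocess

subprocess.run(
    [
        "docker", "buildx", "build",
        "--platform", "linux/amd64,linux/arm64",        # both architectures in one go
        "-t", "registry.example.com/myservice:latest",  # hypothetical tag
        "--push",  # multi-platform results generally must be pushed, not loaded locally
        ".",
    ],
    check=True,
)
```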

My builds are faster, laptop stays cooler, and battery lasts longer. I love it.

If I was building desktop apps I assume it would be a less pleasant experience like you mention.



I don’t know a single engineer who had issues with M chips, and most engineers I know (me included) benefited considerably from the performance gains, so perhaps your niche isn’t that universal?


I’m doing Ruby on Rails dev too. I don’t notice a huge difference between macOS and Linux for how I work.

There’s quirks to either OS.

Eg when on Gnome it drives me mad that it won’t focus recently launched apps.

On macOS it annoys me that I have to install a 3rd party util to move windows around.

Meh, you just adapt after a while.



You must have an unusual setup because, between Rosetta and rosetta in Virtualization.framework VMs (configurable in Docker Desktop or Rancher Desktop), I’ve never had issues running intel binaries on my Mac


We have to cross-compile anyway because now we're deploying to arm64 Linux (AWS Graviton) in addition to x86 Linux.

So even if all developers of your team are using Linux, unless you want to waste money by ignoring arm64 instances on cloud computing, you'll have to setup cross compilation.
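
That setup is usually tiny for Go services, since cross compilation is built into the toolchain via GOOS/GOARCH. A sketch (package path and output names are hypothetical), scripted in Python only for consistency with the other examples here:

```python
# Sketch: building the same Go service for both the x86 fleet and Graviton.
import os
import subprocess

for goarch in ("amd64", "arm64"):
    env = dict(os.environ, GOOS="linux", GOARCH=goarch, CGO_ENABLED="0")
    subprocess.run(
        ["go", "build", "-o", f"bin/myservice-linux-{goarch}", "./cmd/myservice"],
        env=env,
        check=True,
    )
```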



It is, provided that the hardware vendor has reasonably decent support for power management, and you're willing to haul around an AC adapter if not. In general, I really like AMD hardware with built-in graphics for this, or alternately, Intel Tiger Lake-U based hardware.

Asahi Linux is shockingly great on Apple Silicon hardware, though.



>Even in the higher end products like the MacBooks you see a lot of professionals (engineers included) who choose it because of its price-performance-value, and who don’t give a shit about luxury.

Most CS professionals who write code have no idea what it takes to build a desktop, so the hardware that they choose is pretty much irrelevant because they aren't specifically choosing for hardware. The reason Apple gets bought, mostly, by anyone including tech people, is the ecosystem. The truth is, nobody really cares that much about actual specs as long as it's good enough to do basic stuff, and when you are indifferent to the actual difference but all your friends are in the ecosystem, the choice is obvious.

You can easily see this yourself: ask these "professionals" about the details of the Apple Neural Engine, and there's a very high chance that they will repeat some marketing material, while failing to mention that Apple does not publish any real docs for the ANE, you have to sign your code to run on the ANE, and you basically have to use Core ML to utilize the ANE. I.e. if they really cared about inference, all of them would be buying laptops with discrete 4090s for almost the same price.
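
For what it's worth, the Core ML point can be made concrete: the Neural Engine is only reachable indirectly, by handing a converted model to Core ML and expressing a compute-unit preference. A sketch with coremltools (the model file, input name, and shape are hypothetical, and whether ops actually land on the ANE is decided by Core ML, not by the caller):

```python
# Sketch only: there is no public ANE API; you ask Core ML via compute units.
import numpy as np
import coremltools as ct

# Load a previously converted model (hypothetical file) with a scheduling
# preference; ct.ComputeUnit.ALL lets Core ML use CPU, GPU, or ANE as it sees fit.
model = ct.models.MLModel("MyModel.mlpackage", compute_units=ct.ComputeUnit.ALL)

# Hypothetical input name/shape; they must match the converted model.
out = model.predict({"input": np.random.rand(1, 3, 224, 224).astype(np.float32)})
print(out.keys())
```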

Meanwhile, if you look at people who came from EE/ECE (who btw on average are far better coders than people with a CS background, based on my 500+ interviews in the industry across several sectors), you see a much larger skew towards Android/custom-built desktops/Windows laptops running Linux. If you lived and breathed Linux and low-level OS work, you tend to appreciate all the power and customization that it gives you because you don't have to go learn how to do things.



Coming from both environments, I'd be wary of making some of these assertions, especially when you consider that any ecosystem that optimizes software and hardware together (from embedded devices all the way to general-purpose computing machines) is generally going to perform well, given the appropriate engineering focus. This applies regardless of (RT)OS / hardware choice, i.e., it's simply common sense.

The signing of binaries is a part of adult developer life, and is certainly required for the platforms you mention as well.

Unquestionably, battery life on 4090-based laptops sucks on a good day, and if you're working long hours, the last thing you want to have to do is park yourself next to your 350W adapter just to get basic work done.



I disagree.

Apple is selling hardware and scaling AI by utilizing it is simply a smart move.

Instead of building huge GPU clusters, having to deal with NVIDIA for GPUs (Apple kicked NVIDIA out years ago because of disagreements), Apple is building mainly on existing hardware.

This is in other terms utilizing CPU power.

On the other hand, this helps their marketing keep high price points, now that Apple is going to differentiate its CPU power, and therefore hardware prices, via AI functionality correlating with CPU power. This is also consistent with Apple stopping the MHz comparisons years ago.



Seen MLX folks post on X about nice results running local LLMs. https://github.com/ml-explore/mlx

Also, Siri. And consider: you're scaling AI on Apple's hardware, too; you can develop your own local custom AI on it, and there's more memory available for linear algebra in a maxed-out MBP than in the biggest GPUs you can buy.

They scale the VRAM capacity with unified memory and that plus a ton of software is enough to make the Apple stuff plenty competitive with the corresponding NVIDIA stuff for the specific task of running big AI models locally.
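
A tiny MLX sketch of the unified-memory point (assumes `pip install mlx` on an Apple silicon Mac): arrays live in memory visible to both CPU and GPU, so there is no explicit host-to-device copy before the GPU does the work.

```python
# Sketch: MLX arrays are lazily evaluated and sit in unified memory, so the
# matmul below runs on the GPU without any .to(device)-style copy step.
import mlx.core as mx

a = mx.random.normal((4096, 4096))
b = mx.random.normal((4096, 4096))
c = a @ b        # builds the computation lazily
mx.eval(c)       # forces evaluation on the default (GPU) device
print(c.shape, c.dtype)
```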



I have not found any good proxy which works well with Cisco VPN software. Charles and Proxyman work intermittently at best and require disconnecting from the VPN and various such dances.

Fiddler on windows works flawlessly.



Any decent laptop from the same era. My parents are using both HP ProBooks and Lenovo Thinkpads from that era currently and they are working perfectly and maintenance costs are lower than the same era macbooks...

I own a MacBook Air, I won't be buying another purely because the moment I need to upgrade anything or repair anything it's effectively ewaste.



> Somewhat true but things are changing. While there are plenty of “luxury” Apple devices like Vision Pro or fully decked out MacBooks for web browsing we no longer live in a world where tech are just lifestyle gadgets.

I notice your use of the weasel word "just".

We undoubtedly live in a world where Apple products are sold as lifestyle gadgets. Arguably it's more true today than it ever was. It's also a world where Apple's range of Veblen goods managed to gain footing in social circles to an extent that we have kids being bullied for owning Android phones.

Apple's lifestyle angle is becoming specially relevant because they can no longer claim they sell high-end hardware, as the difference in specs between Apple's hardware and product ranges from other OEMs is no longer noticeable. Apple's laughable insistence on shipping laptops with 8GB of RAM is a good example.

> Even in the higher end products like the MacBooks you see a lot of professionals (engineers included) who choose it because of its price-performance-value, and who don’t give a shit about luxury.

I don't think so, and that contrasts with my personal experience. All my previous roles offered a mix of MacBooks and Windows laptops, and MacBooks were the option new arrivals opted for because they were seen as perks, and the particular choice of Windows ones in comparison was not as impressive, even though they out-specced Apple's offering (mid-range HP and Dell). In fact, in a recent employee review the main feedback was that the MacBook Pro line was under-specced because at best it shipped with only 16GB of RAM while the less impressive HP ones already came with 32GB. In previous years, they called for the replacement of the MacBook line due to the rate of keyboard malfunctions. Meaning, engineers were purposely picking the underperforming option for non-technical reasons.



I bought my first Apple product roughly 11 years ago explicitly because it had the best accessibility support at the time (and that is still true). While I realize you only see your slice of the world, I really cringe when I see the weasel-word "lifestyle". This "Apple is for the rich kids"-fairytale is getting really really old.


price-performance is not a thing for a vast majority of users. Sure I'd like a $40k car but I can only afford a $10k car. It's not nice but it gets me from a to b on my min-wage salary. Similarly, I know plenty of friends and family. They can either get 4 macs for $1000 each (mom, dad, sister, brother) so $4k. Or they can get 4 windows PCs for $250 so $1k total.

The cheap Windows PCs suck just like a cheap car sucks (ok, they suck more), but they still get the job done. You can still browse the web, read your email, watch a youtube video, post a youtube video, write a blog, etc.. My dad got some HP celeron. It took 4 minutes to boot. It still ran though and he paid probably $300 for it vs $999 for a mac. He didn't have $999.



I’m not saying one or the other is better for your family members. But MacBooks last very long. We'll see about the M series but for myself for instance I got the M1 air without fans, which has the benefit of no moving pieces or air inlets, so even better. My last one, a MBP from 2011 lasted pretty much 10 years. OS updates are 8-10y.

> The cheap Windows PCs suck […], but they still get the job done

For desktop, totally. Although I would still wipe it with Ubuntu or so because Windows is so horrible these days even my mom is having a shit time with only browsing and video calls.

A random laptop however is a different story. Except for premium brands (closer to Apple prices) they tend to have garbage battery life, infuriating track pad, massive thermal issues, and preloaded with bloatware. Apple was always better here, but now with the lower power/heat of the ARM chips, they got soooo much better overnight.



Clumsily phrased. What I meant is that iPhones or similarly priced smartphones are affordable and common for, say, the middle class in countries with purchasing power similar to Eastern European countries. You’d have to go to poorer countries like Vietnam or Indonesia for iPhones to be “out of reach”, given the immense value they provide.

Heck, now I see that even in Vietnam, Apple is the #1 vendor with 28% market penetration according to statcounter. That’s more than I thought, even though I was just there…

Speaking of India, they’re at 4% there. That’s closer to being luxury.



I think US is their main market, though. The rest of the world prefers cheaper better phones and doesn't mind using WhatsApp for messaging, instead of iMessage.


As a single market, US is probably biggest. I’m seeing numbers that say that the “Americas” is a bit less than half of global revenue, and that would include Canada and all of South and Latin America. So the rest of the world is of course very important to Apple, at least financially.

> doesn't mind using WhatsApp for messaging

Well WhatsApp was super early and way ahead of any competition, and the countries where it penetrated had no reason to leave, so it’s not exactly like they settle for less. It has been a consistently great service (in the category of proprietary messaging apps), even after Zuck took over.



It's not about price-performance value at all. Mac is still the most expensive way to buy performance. And Apple is only particularly popular in the US. Android phones dominate most other markets, particularly poor markets.

Apple is popular in the US because a) luxury brands hold sway b) they goad customers into bullying non-customers (blue/green chats) and c) they limit features and customizability in favor of simpler interfaces.

It's popular with developers because a) performance is valuable even at Apples steep cost b) it's Unix-based unlike Windows so shares more with the Linux systems most engineers are targeting.



> > In case it is not abundantly clear by now: Apple's AI strategy is to put inference (and longer term even learning) on edge devices. This is completely coherent with their privacy-first strategy (which would be at odds with sending data up to the cloud for processing).

> Their primary business goal is to sell hardware.

There is no contradiction here. No need for luxury. Efficient hardware scales, Moore's law has just been rewritten, not defeated.

Power efficiency combined with shared and extremely fast RAM is still a formula for success, as long as they are able to deliver.

By the way, M-series MacBooks have crossed bargain territory by now compared to WinTel in some specific (but large) niches, e.g. the M2 Air.

They are still technically superior in power efficiency and still competitive in performance in many common uses, be it traditional media decoding and processing, GPU-heavy tasks (including AI), single-core performance...

By the way, this includes web technologies / JS.



This is it. An M series Air is an incredible machine for most people - people who likely won’t ever write a line of JS or use a GPU. Email, banking, YouTube, etc. on a device with incredible battery and hardware that will likely be useful for a decade is perfect. The average user hasn’t even heard of HN.


It's great for power users too. Most developers really enjoy the experience of writing code on Macs. You get a Unix based OS that's just far more usable and polished than a Linux laptop.

If you're into AI, there's objectively literally no other laptop on the planet that is competitive with the GPU memory available on an MBP.



It doesn't need to stay true forever.

The alternative is Google / Android devices and OpenAI wrapper apps, both of which usually offer a half baked UI, poor privacy practices, and a completely broken UX when the internet connection isn't perfect.

Pair this with the completely subpar Android apps, Google dropping support for an app about once a month, and suddenly I'm okay with the lesser of two evils.

I know they aren't running a charity, I even hypothesized that Apple just can't build good services so they pivoted to focusing on this fake "privacy" angle. In the end, iPhones are likely going to be better for edge AI than whatever is out there, so I'm looking forward to this.



> The alternative is Google / Android devices

No, the alternative is Android devices with everything except firmware built from source and signed by myself. And at the same time, being secure, too.

You just can't have this on Apple devices. On Android side choices are limited too, I don't like Google and especially their disastrous hardware design, but their Pixel line is the most approachable one able to do all these.

Heck, you can't even build your own app for your own iPhone without buying additional hardware (a Mac; this is not a software issue but a legal one, since the iOS SDK is licensed to you on the condition that it is used on Apple hardware only) and a yearly subscription. How is this acceptable at all?



> No, the alternative is Android devices with everything except firmware built from source and signed by myself

Normal users will not do this. Just because many of the people here can build and sign a custom Android build doesn't mean that is a viable commercial alternative. It is great that is an option for those of us who can do it, but don't present it as a viable alternative to the iOS/Google ecosystems. The fraction of people who can and will be willing to do this is really small. And even if you can do it, how many people will want to maintain their custom built OSes?



> Normal users will not do this.

Unfortunately a lot of the "freedom" crowd think that unless you want to be an 80s sysadmin you don't deserve security or privacy. Or computers.



the main reason the masses don't have privacy and security-centred systems is that they don't demand them and they will trade it away for a twopence or for the slightest increment in convenience

a maxim that seems to hold true at every level of computing is that users will not care about security unless forced into caring

with privacy they may care more, but they are easily conditioned to assume it's there or that nothing can realistically be done about losing it



I, an engineer, am not doing this myself either. There is a middle ground though: just use a privacy-oriented Android build, like DivestOS. [1]

There are a couple caveats:

1. It is still a bit tricky for a non-technical person to install. Should not be a problem if they know somebody who can help, though. There's been some progress making the process more user friendly recently (e.g. WebUSB-based GrapheneOS installer).

2. There are some papercuts if you don't install Google services on your phone. microG [2] helps with most but some still remain. My main concern with this setup is that I can't use Google Pay this way, but having to bring my card with me every time seems like an acceptable trade off to me.

[1]: https://divestos.org/

[2]: https://microg.org/



> No, the alternative is Android devices with everything except firmware built from source and signed by myself. And at the same time, being secure, too.

There are people who don't know how to use a file explorer; a new generation is growing up in a world of iPhones without ever seeing a file system. Any other bright ideas?



WebGPU and many other features on iOS are unimplemented or implemented in half-assed or downright broken ways.

These features work on all the modern desktop browsers and on Android tho!



> WebGPU and many other features

WebGPU isn't standardized yet. Hell, most of the features people complain about aren't part of any standard, but for some reason there's this sense that if it's in Chrome, it's standard - as if Google dictates standards.



> but for some reason there's this sense that if it's in Chrome, it's standard - as if Google dictates standards.

Realistically, given the market share of Chrome and Chromium based browsers, they kind of do.



I've been using Firefox since the Quantum version came out. It feels slightly slower than Chrome, but the difference is negligible to me. Otherwise I can't tell a difference (except some heavy web-based Office-like solutions screaming 'Your browser is not supported!' that actually work fine).


> How is this acceptable at all?

Because as you described, the only alternatives that exist are terrible experiences for basically everyone, so people are happy to pay to license a solution that solves their problems with minimal fuss.

Any number of people could respond to “use Android devices with everything except firmware built from source and signed by myself” with the same question.



The yearly subscription is for publishing your app on Apple's store and definitely helps keep some garbage out. Running your own app on your own device is basically solved with free third-party solutions now (see AltStore, plus a newer method I can't recall atm).


Notice that parent never talked about publishing apps, just building and running apps on their own device. "Publishing on AltStore" (or permanently running the app on your own device in any other way) still requires a $100/year subscription as far as I'm aware.


> No, the alternative is Android devices with everything except firmware built from source and signed by myself

I wouldn't bet on this long term, since it fully relies on Google hardware, and Google's long-term strategy is to remove your freedom piece by piece and cash in on it, not to support it.

The real alternative is GNU/Linux phones, Librem 5 and Pinephone, without any ties to greedy, anti-freedom corporations.



> Heck, you can't even build your own app for your own iPhone without buying additional hardware (a Mac; this is not a software issue but a legal one, since the iOS SDK is licensed to you on the condition that it is used on Apple hardware only) and a yearly subscription. How is this acceptable at all?

Because they set the terms of use of the SDK? You're not required to use it. You aren't required to develop for iOS. Just because Google gives it all away for free doesn't mean Apple has to.



> You aren't required to develop for iOS.

Sure, but as a SWE I'm not going to buy a computer that is unable to run my own code. A smartphone is an ergonomic portable computer, so I say no to the iPhone, and I'd like to remind others who haven't thought this through about it.



> You aren't required to develop for iOS

Do you have a legal right to write software or run your own software for hardware you bought?

Because it's very easy to take away a right by erecting artificial barriers, just like how you could discriminate by race at work but pretend you are doing something else.



> Do you have a legal right to write software or run your own software for hardware you bought?

I've never heard of such a thing. Ideally I'd like that, but I don't have such freedoms with the computers in my cars, for example, or the one that operates my furnace, or even for certain parts of my PC.



So you bought "a thing" but you can't control what it does or how it does it, and you don't get to decide what data it collects or who can see that data.

You aren't allowed to repair the "thing" because the software can detect you changed something and will refuse to boot. And whenever it suits the manufacturer, they will decide when the "thing" is declared out of support and stops functioning.

I would say you are not an owner then; you (and I) are just suckers who are paying for the party. Maybe it's a lease. But then we also pay when it breaks, so it's more like digital feudalism.



> Do you have a legal right to write software or run your own software for hardware you bought?

No, obviously not. Do you have a right to run a custom OS on your PS5? Do you have a right to run a custom application on your cable set-top box? Etc. Such a right obviously doesn’t exist and most people generally are somewhere between “don’t care” and actively rejecting it for various reasons (hacking in games, content DRM, etc).

It's fine if you think there should be, but it continues this weird trend of using Apple as a foil for complaining about random other issues that other vendors tend to be just as bad or oftentimes even worse about, simply because they're a large company with a large group of anti-fans/haters who will readily nod along.

Remember when the complaint was that the pelican case of factory OEM tools you could rent (or buy) to install your factory replacement screen was too big and bulky, meaning it was really just a plot to sabotage right to repair?

https://www.theverge.com/2022/5/21/23079058/apple-self-servi...



> Remember when the complaint was that the pelican case of factory OEM tools you could rent (or buy) to install your factory replacement screen was too big and bulky, meaning it was really just a plot to sabotage right to repair?

Yes, I do. That was and continues to be a valid complaint, among all other anti-repair schemes Apple have come up with over the years. DRM for parts, complete unavailability of some commonly repaired parts, deliberate kneecapping of "Apple authorized service providers", leveraging the US customs to seize shipments of legitimate and/or unlabeled replacement parts as "counterfeits", gaslighting by official representatives on Apple's own forums about data recovery, sabotaging right to repair laws, and even denial of design issues[1] to weasel out of warranty repair just to name a few.

All with the simple anti-competitive goal of making third party repair (both authorized and independent) a less attractive option due to artificially increased prices, timelines to repair, or scaremongering about privacy.

https://arstechnica.com/gadgets/2022/12/weakened-right-to-re...

https://www.pcgamer.com/ifixit-says-apples-iphone-14-is-lite...

[1] Butterfly keyboards, display cables that were too short and failed over time



Google has also been working on (and provides kits for) local machine learning on mobile devices... and they run on both iOS and Android. The Gemini App does send data in to Google for learning, but even that you can opt out of.

Apple's definitely pulling a "Heinz" move with privacy, and it is true that they're doing a better job of it overall, but Google's not completely horrible either.



> better for edge AI than whatever is out there, so I'm looking forward to this

What exactly are you expecting? The current hype for AI is large language models. The word 'large' has a certain meaning in that context: much larger than can fit on your phone. Everyone is going crazy about edge AI, what am I missing?



> Everyone is going crazy about edge AI, what am I missing?

If you clone a model and then bake in a more expensive model's correct/appropriate responses to your queries, you now have the functionality of the expensive model in your clone. For your specific use case.

The resulting case-specific models are small enough to run on all kinds of hardware, so everyone's seeing how much work can be done on their laptop right now. One incentive for doing so is that your approaches to problems are otherwise constrained by the cost and security of the Q&A round trip.
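Concretely, a minimal sketch of that "bake in the teacher's answers" step could look like this (assuming an OpenAI-style API for the expensive teacher; the prompts, model name, and output file are placeholders for your own use case, and the resulting JSONL would then be used to fine-tune a small local model):

  import json
  from openai import OpenAI

  client = OpenAI()  # the expensive "teacher" model, queried once, offline
  use_case_prompts = [
      "Classify this support ticket: 'App crashes on launch after the update'",
      "Classify this support ticket: 'How do I export my data?'",
  ]

  with open("distilled_trainset.jsonl", "w") as f:
      for prompt in use_case_prompts:
          answer = client.chat.completions.create(
              model="gpt-4o",
              messages=[{"role": "user", "content": prompt}],
          ).choices[0].message.content
          # Each (prompt, teacher answer) pair becomes supervised data
          # for fine-tuning a much smaller, case-specific "clone".
          f.write(json.dumps({"prompt": prompt, "completion": answer}) + "\n")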



Quantized LLMs can run on a phone, like Gemini Nano or OpenLLaMA 3B. If a small local model can handle the simple stuff and delegate harder tasks to a model in the data center when connectivity allows, you could get an even better experience.
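A toy version of that delegation logic, just to make the shape of it concrete (run_local and run_cloud are hypothetical stand-ins for an on-device quantized model and a hosted one, not real APIs):

  def run_local(prompt: str) -> tuple[str, float]:
      """Return (answer, confidence) from the on-device quantized model."""
      raise NotImplementedError  # e.g. a 3B model via llama.cpp / Core ML

  def run_cloud(prompt: str) -> str:
      """Return an answer from the large data-center model."""
      raise NotImplementedError

  def answer(prompt: str, threshold: float = 0.8, online: bool = True) -> str:
      text, confidence = run_local(prompt)
      # Keep simple queries on-device; escalate only when the small model
      # is unsure and a network connection is actually available.
      if confidence >= threshold or not online:
          return text
      return run_cloud(prompt)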


Using RAG, a smaller local LLM combined with local data (e.g. your emails, iMessages, etc.) can be more useful than a large external LLM that doesn't have your data.

No point asking GPT4 “what time does John’s party start?”, but a local LLM can do better.
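Something like this, as a rough local-RAG sketch (sentence-transformers is real, but the messages list is a stand-in for your mail/iMessage data, and the assembled prompt would go to whatever on-device model you run):

  import numpy as np
  from sentence_transformers import SentenceTransformer

  messages = [
      "John: party starts at 7pm on Saturday, bring snacks",
      "Mom: dentist appointment moved to Tuesday at 3pm",
  ]
  embedder = SentenceTransformer("all-MiniLM-L6-v2")
  doc_vecs = embedder.encode(messages, normalize_embeddings=True)

  def retrieve(query: str, k: int = 1) -> list[str]:
      # With normalized vectors, cosine similarity is just a dot product.
      q = embedder.encode([query], normalize_embeddings=True)[0]
      top = np.argsort(doc_vecs @ q)[::-1][:k]
      return [messages[i] for i in top]

  query = "What time does John's party start?"
  context = "\n".join(retrieve(query))
  prompt = f"Context:\n{context}\n\nQuestion: {query}"
  # 'prompt' is then fed to the small on-device LLM, which now has the
  # one piece of local data a remote model could never see.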



This is why I think Apple’s implementation of LLMs is going to be a big deal, even if it’s not technically as capable. Just making Siri better able to converse (e.g. ask clarifying questions) and giving it the context offered by user data will make it dramatically more useful than silo’d off remote LLMs.


> If a small local model can handle the simple stuff and delegate harder tasks to a model in the data center when connectivity allows, you could get an even better experience.

Distributed mixture of experts sounds like an idea. Is anyone doing that?



Sounds like an attack vector waiting to happen if you deploy enough competing expert devices into a crowd.

I’m imagining a lot of these LLM products on phones will be used for live translation. Imagine a large crowd event of folks utilizing live AI translation services being told completely false translations because an actor deployed a 51% attack.



I’m not particularly scared of a 51% attack between the devices attached to my Apple ID. If my iPhone splits inference work with my idle MacBook, Apple TV, and iPad, what’s the problem there?


>subpar Android apps

Care to cite these subpar Android apps? The app store is filled to the brim with subpar and garbage apps.

>Google dropping support for an app about once a month

I mean if you're going to lie why not go bigger

>I'm okay with the lesser of two evils.

So the more evil company is the one that pulled out of China because they refused to hand over their users data to the Chinese government on a fiber optic silver plate?



Google operates in China albeit via their HK domain.

They also had project DragonFly if you remember.

The lesser of two evils is that one company doesn’t try to actively profile me (in order for their ads business to be better) with every piece of data it can find and forces me to share all possible data with them.

Google is famously known to kill apps that are good and used by customers: https://killedbygoogle.com/

As for the subpar apps: there is a massive difference in network traffic on the Home Screen between iOS and Android.



>Google operates in China albeit via their HK domain.

The Chinese government has access to the iCloud account of every Chinese Apple user.

>They also had project DragonFly if you remember.

Which never materialized.

>The lesser of two evils is that one company doesn’t try to actively profile me (in order for their ads business to be better) with every piece of data it can find and forces me to share all possible data with them.

Apple does targeted and non targeted advertising as well. Additionally, your carrier has likely sold all of the data they have on you. Apple was also sued for selling user data to ad networks. Odd for a Privacy First company to engage in things like that.

>Google is famously known to kill apps that are good and used by customers: https://killedbygoogle.com/

Google has been around for 26 years I believe. According to that link 60 apps were killed in that timeframe. According to your statement that Google kills an app a month that would leave you 252 apps short. Furthermore, the numbers would indicate that Google has killed 2.3 apps per year or .192 apps per month.

>As for the subpar apps: there is a massive difference in network traffic on the Home Screen between iOS and Android.

Not sure how that has anything to do with app quality, but if network traffic is your concern there's probably a lot more an Android user can do than an iOS user to control or eliminate the traffic.



> Google has been around for 26 years I believe. According to that link 60 apps were killed in that timeframe. According to your statement that Google kills an app a month that would leave you 252 apps short. Furthermore, the numbers would indicate that Google has killed 2.3 apps per year or .192 apps per month.

Most of the "Services" on that list are effectively apps, too:

VPN by Google One, Album Archive, Hangouts, all the way back to Answers, Writely, and Deskbar.

I didn't touch hardware, because I think that should be considered separately.

The first of 211 services on that site was killed in 2006.

The first of the 60 apps on that site was killed in 2012.

So even apps alone, 4.28 a year.

But more inclusively, 271 apps or services in 17 years is ~16/year, over one a month.

You need to remind yourself of the site guidelines about assuming the worst. Your comments just come across condescendingly.



>Most of the "Services" on that list are effectively apps, too:

Even with the additional apps you've selected it still doesn't come close to the one app per month claim.

>I didn't touch hardware, because I think that should be considered separately.

So why even mention it? Is Apple impervious to discontinuing hardware?

>The first of 211 services on that site was killed in 2006.

So we're talking about services now? Or apps? Or apps and services? The goal posts keep moving.

>You need to remind yourself of the site guidelines about assuming the worst. Your comments just come across condescendingly.

I suggest you also consult the guidelines in regard to calling people names. My comments were never intended to come across that way.



I think it was Paul Thurrott on the Windows Weekly podcast who said that all these companies don't really care about privacy. Apple takes billions of dollars a year to direct data towards Google via the search defaults. Clearly privacy has a price. And I suspect it will only get worse with time as they keep chasing the next quarter.

Tim Cook unfortunately is so captured by that quarterly 'please the shareholders' mindset that it is only a matter of time.



It doesn’t matter to me if they “really care” about privacy or not. Megacorps don’t “really care” about anything except money.

What matters to me is that they continue to see privacy as something they can sell in order to make money.



Yeah, some poor phrasing on my behalf.

I do hope that those working in these companies actually building the tools do care. But unfortunately, it seems that corruption is an emergent property of complexity.



The Google payments are an interesting one; I don't think it's a simple "Google pays them to prefer them", but a "Google pays them to stop them from building a competitor".

Apple is in the position to build a competing search product, but the amount Google pays is roughly what Apple would have to earn from such a product, and that is improbable even if it means they could set their own search engine as the default.



> The promise of privacy is one way in which they position themselves, but I would not bet the bank on that being true forever.

Indeed.

Privacy starts with architectural fundamentals that are very difficult to retrofit...

If a supplier of products has not built the products this way, it would be naive to bet bank or farm on the supplier. Even if there were profound motivation to retrofit.

Add to this the general tendency of the market to exploit its customers.



> but it is about selling luxury hardware.

While Apple is first and foremost a hardware company, it has more or less always been about the "Apple experience". They've never "just" been a hardware company.

For as long as Apple has existed, they've done things "their way" both with hardware and software, though they tend to want to abstract the software away.

If it was merely a question of selling hardware, why does iCloud exist? Or AppleTV+, or Handoff, or iMessage, or the countless other seemingly small life improvements that somehow the remainder of the industry cannot seem to figure out how to do well?

Just a "simple" thing like switching headphones seamlessly between devices is something I no longer think about; it just happens, and it takes a trip with a Windows computer and a regular Bluetooth headset to remind me how things used to be.

As part of their "privacy first" strategy, iMessage also fits in nicely. Apple doesn't have to operate a huge instant messaging network, which undoubtedly is not making a profit, but they do, because having one entry to secure, encrypted communication fits well with the Apple Experience. iMessage did so well at abstracting the ugly details of encryption that few people even realize that that's what the blue bubble is actually about; it more or less only means your message is end-to-end encrypted. As a side effect you can also send full-resolution images (and more), but that's in no way unique to iMessage.



I can't buy a MacBook Air for less than $999, and that's for a model with 8GB RAM, an 8-core CPU and 256GB SSD. The equivalent (based on raw specs) in the PC world runs for $300 to $500.

How is something that is twice as expensive as the competition not a luxury device?

EDIT: Because there's repeated confusion in the replies: I am not saying that a MacBook Air is not objectively a better device. I'm saying it is better by metrics that fall strictly into the "luxury" category.

Better build quality, system-on-a-chip, better OS, better battery life, aluminum case—all of these are luxury characteristics that someone who is looking for a functional device that meets their needs at a decent price won't have as dealbreakers.



> How is something that is twice as expensive as the competition not a luxury device?

You can buy a version of it from Walmart at half the price of a "normal" retailer. Does that mean every "normal" retailer is actually a luxury goods dealer?

Is my diner a luxury restaurant because a burger costs twice as much as McDonald's?

Stop the silliness.



> You can buy a version of it from Walmart at half the price of a "normal" retailer. Does that mean every "normal" retailer is actually a luxury goods dealer?

What percent of that retailer's products does that comparison apply to?

If it's more than half then yeah that's probably a luxury goods dealer.



> The equivalent (based on raw specs) in the PC world runs for $300 to $500.

Equivalent device?! Find me a Windows laptop in ANY price category that can match the weight, fanless design, screen quality, speaker quality, and battery life of the Air.



Weight

> 3.92 lb

Battery life

> 6.5 h

Probably half of that.

Fans that sound like a jet engine, screen quality that would force me to stab my eyes, speakers sounding worse than a toilet, plastic build.

I'm not convinced.



That's more like cheap vs middle of the road. There is no luxury space in laptops - displays, iPads, and workstations maybe but that's it (and those are more pro than luxury).

$999 amortized over 3 years is about $28/mo, which is less than what even middle-class people spend on coffee.



Oh dear. A 16:10 screen with superior resolution, brightness, and gamut - and it still gets superior battery life driving all those pixels... that's a headline feature that even a non-propellerhead can observe (I was honestly surprised, when I looked up that Acer screen, at what a dim, narrow piece of shit it is) - and notably there are ballpark-priced systems with better screens.

I think you unjustifiably downplay how much of a selling point a screen that looks great (or at least decent) on the floor is. And I know tons of devs that put up with the 45% NTSC abominations on Thinkpads that aren't even suitable for casual photo editing or web media; just because you make do with that doesn't automatically make a halfway decent display on a laptop a "luxury".

Sorry, but I don't buy the "everything that isn't a $300 econo shit laptop is luxury" thesis repeated ad nauseam.



"Luxury" often includes some amount of pure status symbols added to the package, and often on what is actually a sub-par experience. The quintessential luxury tech device were the Vertu phones from just before and even early in the smartphone era - mid-range phones tech and build quality-wise, with encrusted gems and gold inserts and other such bling, sold at several thousand dollars (Edit: they actually ranged between a few thousand dollars all the way to 50,000+).

But the definition of luxury varies a lot by product category. Still, high-end and luxury are separate concepts, which ven when they do overlap.



You just made up the "sub-par experience" as a defining point of a luxury product. A luxury product is defined by being a status symbol (check for all Apple devices) and especially by its price. A luxury car like a Bentley will still bring you from point A to point B like the cheapest Toyota.


I doubt I am alone in saying that I would gladly pay twice the price to avoid having to use Windows. It's the most user-hostile, hand-holdy, second-guess-and-confirm-my-explicit-command-ey os I've used to date. And bloatware baked in? No thanks.


You're probably right. I am in the middle-class, maybe lower middle-class, and I live in the US. I have advantages and opportunities that many in other circumstances do not and I am sincerely grateful for them.


The Walmart variant was introduced 6 weeks ago to offload excess stocks of a four year old discontinued model. I'm not sure your argument of "at only 70% of the price of a model two generations newer" is the sales pitch you think it is.


The same thing can be (and is!) said about luxury car brands. That's what makes the MacBook Air a luxury item.

Most people, when given the pitch you just gave me for a 2x increase in price, will choose the cheaper item, just like they choose the cheaper car.



They’re tools. This attempt to treat them as luxury goods doesn’t hold with those. It’s entirely common for even people who want to do some home repair—let alone professionals—but aren’t clueless about DIY to spend 2x the cheapest option, because they know the cheapest one is actually worth $0. More will advocate spending way more than 2x, as long as you’re 100% sure you’re going to use it a lot (like, say, a phone or laptop, even for a lot of non-computer-geeks). This is true even if they’re just buying a simple lowish-power impact driver, nothing fancy, not the most powerful one, not the one with the most features. Still, they’ll often not go for the cheapest one, because those are generally not even fit for their intended purpose.

[edit] I mean sure there are people who just want the Apple logo, I’m not saying there are zero of those, but they’re also excellent, reliable tools (by the standards of computers—so, still bad) and a good chunk of their buyers are there for that. Even the ones who only have a phone.



I didn't go for the cheapest option: I'm typing this on a laptop that I bought a few months ago for $1200. It has an aluminum case, 32GB RAM, an AMD Ryzen CPU that benchmarks similar to the M3, and 1TB SSD. I can open it up and replace parts with ease.

The equivalent from Apple would currently run me $3200. If I'm willing to compromise to 24GB of RAM I can get one for $2200.

What makes an Apple device a luxury item isn't that it's more expensive, it's that no matter what specs you pick it will always be much more expensive than equivalent specs from a non-luxury provider. The things that Apple provides are not the headline stats that matter for a tool-user, they're luxury properties that don't actually matter to most people.

Note that there's nothing wrong with buying a luxury item! It's entirely unsurprising that most people on HN looking at the latest M4 chip prefer luxury computers, and that's fine!



> It has an aluminum case, 32GB RAM, an AMD Ryzen CPU that benchmarks similar to the M3, and 1TB SSD.

How much does it weight? Battery life? Screen quality? Keyboard? Speakers?



> no matter what specs you pick it will always be much more expensive than equivalent specs from a non-luxury provider

On the phone side, I guess you would call Samsung and Google luxury providers? On the laptop side there are a number of differentiating features that are of general interest.

> The things that Apple provides are not the headline stats that matter for a tool-user, they're luxury properties that don't actually matter to most people

Things that might matter to regular people (and tool users):

- design and build for something you use all day

- mic and speakers that don't sound like garbage (very noticeable and relevant in the zoom/hybrid work era)

- excellent display

- excellent battery life

- seamless integration with iPhone, iPad, AirPods

- whole widget: fewer headaches vs. Windows (ymmv); better app consistency vs. Linux

- in-person service/support at Apple stores

It's hard to argue that Apple didn't reset expectations for laptop battery life (and fanless performance) with the M1 MacBook Air. If Ryzen has caught up, then competition is a good thing for all of us (maybe not intel though...) In general Apple isn't bleeding edge, but they innovate with high quality, very usable implementations (wi-fi (1999), gigabit ethernet (2001), modern MacBook Pro design (2001), "air"/ultrabook form factors (2008), thunderbolt (2011), "retina" display and standard ssd (2012), usb-c (2016), M1: SoC/SiP/unified memory/ARM/asymmetric cores/neural engine/power efficiency/battery life (2020) ...and occasionally with dubious features like the touchbar and butterfly keyboard (2016).)



Huh. Most of the folks I know on Apple stuff started out PC (and sometimes Android—I did) and maybe even made fun of Apple devices for a while, but switched after exposure to them because they turned out to be far, far better tools. And not even much more expensive, if at all, for TCO, given the longevity and resale value.


Eh, I have to use a MacBook Pro for work because of IT rules and I'm still not sold. Might be because I'm a Linux person who absolutely must have a fully customizable environment, but MacOS always feels so limited.

The devices are great and feel great. Definitely high quality (arguably, luxury!). The OS leaves a lot to be desired for me.



I spent about a decade before switching using Linux as my main :-) Mostly Gentoo and Ubuntu (man, it was good in the first few releases)

Got a job in dual-platform mobile dev and was issued a MacBook. Exposure to dozens of phones and tablets from both ecosystem. I was converted within a year.

(I barely customize anything these days, fwiw—hit the toggle for “caps as an extra ctrl”, brew install spectacle, done. Used to have opinions about my graphical login manager, use custom icon sets, all that stuff)



Also, those things aren't even true about Apple devices. Apple fanboys have been convinced that their hardware really is way better than everything else for decades. It has never been true and still isn't.


What metrics are you using for build quality? Admittedly I don't know a ton of mac people (I'm an engineer working in manufacturing) but the mac people I know, stuff always breaks, but they're bragging about how apple took care of it for free.


Clean OS install? You haven't used Windows in a while, have you?

I'm a Linux guy but am forced to use Macs and Windows every now and then.

Windows has outpaced macOS for a decade straight.

macOS looks like it hasn't been updated in years. It's constantly bugging me for passwords for random things. It is objectively the worst OS. I'd rather work on a Chromebook.



I think he has different criteria for what bothers him; that's okay though, isn't it. I get a little annoyed at anything where I have to use a touchpad, not enough to rant about it, but it definitely increases friction (haha) in my thought process.


My last Acer lasted me six years until I decided to replace it for more power (which, notably, I would have done with a MacBook by then too). They're not as well built as a MacBook, but they're well built enough for the average laptop turnover rate.


If it was actually bad value they wouldn't sell as well as they do or review with as much consumer satisfaction as they do.

These products may not offer you much value and you don't have to buy them. Clearly plenty of people and institutions bought them because they believed they offered the best value to them.



Agreed. I'd definitely make the same arguments here as I would for an Audi. There's clearly a market, and that means they're not a bad value for a certain type of person.


If people were actually rational that might be true, but they aren't. Apple survives entirely on the fact that they have convinced people they are cool, not because they actually provide good value.


When you're breaking out SSD speeds you're definitely getting into the "luxury" territory.

As I said in another comment:

The point isn't that the MacBook Air isn't better by some metrics than PC laptops. A Rolls-Royce is "better" by certain metrics than a Toyota, too. What makes a device luxury is if it costs substantially more than competing products that the average person would consider a valid replacement.



They're average. A 512GB M3 MBA gets like 3000MBps for read/write. A 1TB Samsung 990 Pro, which costs less than the upgrade from 256GB to 512GB on the Air, is over twice as fast. And on base models Apple skimps and speeds are slower.


When I bought my cheesegrater Mac Pro, I wanted 8TB of SSD.

Except Apple wanted $3,000 for 7TB of SSD (considering the base price already included 1TB).

Instead, I bought a 4xM.2 PCI card, and 4 2TB Samsung Pro SSDs.

I paid $1,300 for it, got to keep the 1TB "system" SSD.

And I get faster speeds from it, 6.8GBps versus 5.5GBps off the system drive.

For $2,000 I could have got the PCI 4.0 version and SSDs, and get 26GBps.



Not technically true. The Mac Pro 2023 has 6 PCI slots...

... for an eye watering $3,000 over the exact same spec Mac Studio.

I liked my cheesegrater, though I didn't like the heat output.

And I cannot justify that. I sacrificed half the throughput (2800MBps) for $379 and got an external 4 x M.2 TB3 enclosure.

Oh, and a USB 3 hub to replace one I had installed in the cheesegrater to augment the built in ports. $400 give or take.



Yes, but having all three of those things (well, specs/performance is probably just one thing, but treating them as separate as you did means that I don't have to do the heavy lifting of figuring out what a third thing would actually be) IS, in fact, a luxury.

Nobody is away from a power source for longer than 18 hours. MOST people don't need the performance that a MacBook Air has; their NEEDS would be met by a Raspberry Pi... that is, basic finances, logging into various services, online banking, things that first-world citizens "rely" on.

The definition of luxury is "great comfort and extravagance", and every current Apple product fits that definition. Past Apple definitely had non-luxury products, as recently as the iPhone 5c (discontinued 10 years ago)... but Apple has eliminated all low-value options from their lineup.



Good question. I think the answer is that even at thousands of dollars a Windows device's battery can't hit 18-hour specs. Can someone name a Windows device even at 2k+ that acts like an M chip? In fact the pricier Windows machines usually mean a GPU, and those have worse battery than cheap Windows machines (my 4090 lasts an hour or so off the charger).


Thinkpad X250, admittedly at max specs, did 21 hours in 2018. My T470 from 2020 did over 27 hours at max charge.

M-series Macs is when MacBooks stopped sucking at battery life without sleeping and wrecking state every moment they could.



What's the point of comparison? Aren't the 18-hour battery and the Genius Bar part of the "luxury"?

Like I say, an Audi is a luxury car because a Toyota costs less than half as much, and you ask "what about a Toyota with leather seats"?



I am all in on Apple, to be clear. Mac Pros, multiple MBPs, Studio, Pro Display XDR, multiple Watches, phones, iPad Pro.

My experiences (multiple) with Genius Bar have been decidedly more "meh" to outright frustrating, versus "luxury", oftentimes where I know more than the Genius.

Logic Board issues where on a brand new macOS install I could reproducibly cause a kernel panic around graphics hardware. There was an open recall (finally, after waiting MONTHS) on this. It covered my Mac. But because it passed their diagnostic tool, they would only offer to replace the board on a time and materials basis.

I had a screen delamination issue. "It's not that bad - you can't see it when the screen is on, and you have to look for it". Huh. Great "luxury" experience.

And then the multiple "we are going to price this so outrageously, and use that as an excuse to try to upsell". Like the MBA that wouldn't charge due to a circuit issue. Battery fine, healthy. Laptop, fine, healthy, on AC. Just couldn't deliver current to the battery. Me, thinking sure, $300ish maybe with a little effort.

"That's going to be $899 to repair. That's only $100 less than a new MBA, maybe we should take a look at some of the new models?" Uh, no. I'm not paying $900 for a laptop that spends 99% (well, 100% now) of its life on AC power.



Really? You can find a laptop with the equivalent of Apple Silicon for $300-500? And while I haven't used Windows in ages, I doubt it runs as well with 8 GB as macOS does.


what you’re arguing is that a product that meets the basic criteria of a good product makes it luxury. That seems pretty wild to me.

No one calls a Toyota Camry with base options luxury but it works well for a long time and has good quality.



My Acer Aspire lasted me for tens of thousands of hours of use and abuse by small children over 6 years until I replaced it this year because I finally felt like I wanted more power. That's the Toyota Camry of laptops.

The features that Apple adds on top of that are strictly optional. You can very much prefer them and think that they're essential, but that doesn't make it so. Some people feel that way about leather seats.



Curious what criteria you're using for qualifying luxury. It seems to me that materials, software, and design are all on par with other more expensive Apple products. The main difference is the chipset, which I would argue is on an equal quality level with the Pro chips but designed for a less power-hungry audience.


Maybe for you, but I still see sales guys who refuse to work on WinTel when basically all they do is browse the internet and do spreadsheets - mainly just because they would not look cool compared to other sales guys rocking MacBooks.


I provide laptops to people from time to time. They expect to get a MacBook even if the company runs on Windows, and they don't have any real arguments.


I'm not sure what your point is. My point (which I failed to make) is that Apple's incentives are changing because their growth is dependent on services and extracting fees, so they will likely do things that try to make people dependent on those services and find more ways to charge fees (to users and developers).

Providing services is arguably at odds with privacy since a service with access to all the data can provide a better service than one without so there will be a tension between trying to provide the best services, fueling their growth, and privacy.



I apologize for being oblique and kind of snarky.

My point was that it's interesting how we can frame a service business "extracting fees" to imply wrongdoing. When it's pretty normal for all services to charge ongoing fees for ongoing delivery.



It's about the money, it's about perverse incentives and the propensity of service businesses to get away with unfair practices. We have decent laws about your rights as a consumer when you buy stuff, but basically no regulation of services.


There is tons of regulation of services? Everything from fraud / false advertising to disclosure of fees to length and terms of contracts. What regulation do you think is missing?

And as someone who presumably provides services for a living, what additional regulations would you like to be subject to?



So the new iPad & M4 was just some weekend project that they shrugged and decided to toss over to their physical retail store locations to see if anyone still bought physical goods eh


> The promise of privacy is one way in which they position themselves, but I would not bet the bank on that being true forever.

Everybody is so quick to forget that Apple was/is part of PRISM like any other company.



> The promise of privacy

I have very little faith in apple in this respect.

For clarity, just install Little Snitch on your machine and watch what happens with your system. Even without being signed in with an Apple ID and with everything turned off, Apple phones home all the time.



>The promise of privacy is one way in which they position themselves, but I would not bet the bank on that being true forever.

They failed with their ad-business so this is a nice pivot. I'll take it, I'm not usually a cheerleader for Apple, but I'll support anyone who can erode Google's surveillance dominance.



Every company is selling one thing or another, and nothing is going to last forever. I really fail to see what, except for generic negativity, your comment adds to anything.


As a privacy professional for many, many years this is 100% correct. Apple wouldn’t be taking billions from Google for driving users to their ad tracking system, they wouldn’t give the CCP access to all Chinese user data (and maybe beyond), and they wouldn’t be on-again-off-again flirting with tailored ads in Apple News if privacy was a “human right”.

(FWIW my opinion is it is a human right, I just think Tim Cook is full of shit.)

What Apple calls privacy more often than not is just putting lipstick on the pig that is their anticompetitive walled garden.

Pretty much everybody in SV who works in privacy rolls their eyes at Apple. They talk a big game but they are as full of shit as Meta and Google - and there’s receipts to prove it thanks to this DoJ case.

Apple want to sell high end hardware. On-device computation is a better user experience, hands down.

That said, Siri is utter dogshit so on-device dogshit is just faster dogshit.



At this point call your government representatives and ask for new laws, or if you live someplace with laws, actual enforcement (looking at you EU).

The idea that user behavior or consumer choice will change any of this is basically discredited in practice. It will always be cat and mouse until the point that CEOs go to jail; then it will stop.



If you average it out, it is actually not that expensive. Unlike an Android phone, and let us not even talk about Android tablets.

The important thing is that it is a good starting platform for learning AI … most Apple devices do not touch those prices.

Hence, with privacy, it is a good path.

You do not want communism even if it is not expensive in the short term.



Nothing is true forever. Google wasn’t evil forever, Apple won’t value privacy forever.

Until we figure out how to have guarantees of forever, the best we can realistically do is evaluate companies and their products by their behavior now weighted by their behavior in the past.



As soon as the privacy thing goes away, I'd say a major part of their customer base goes away too. Most people use it over Android so they don't get "hacked"; if Apple is doing the hacking, I'd just buy a cheaper alternative.


Maybe true for a lot of the HN population, but my teenagers are mortified by the idea of me giving them android phones because then they would be the pariahs turning group messages from blue to green.


And just to elaborate on this: it's not just snobbery about the color of the texts, for people who rely on iMessage as their primary communication platform it really is a severely degraded experience texting with someone who uses Android. We Android users have long since adapted to it by just avoiding SMS/MMS in favor of other platforms, but iPhone users are accustomed to just being able to send a video in iMessage and have it be decent quality when viewed.

Source: I'm an Android user with a lot of iPhones on my in-laws side.



I'm in Europe and everyone uses WhatsApp, and while Android does have a higher share over here, iPhones still dominate the younger demographics. I'm not denying blue/green is a factor in the US, but it's not even a thing here. It's nowhere near the only, or even a dominant, reason iPhones are successful with young people.


Interesting that some people would take that as an Apple problem and others would take it as a Google problem

Who’s at fault for not having built-in messaging that works with rich text, photos, videos, etc?

Google has abandoned more messaging products than I can remember while Apple focused on literally the main function of a phone in the 21st century. And they get shit for it



Apple get shit for it because they made it a proprietary protocol for which clients are not available on anything except their own hardware. The whole point of messaging is that it should work with all my contacts, not just those who drank the Apple-flavored Kool-Aid.


At least here in Brazil, I've never heard such arguments.

Seems even more unlikely for non technical users.

It's just their latest market campaign, as far as I can tell. The vast majority of people buy iPhones because of the status it gives.



My take is that it's like a fashion accessory. People buy Gucci for the brand, not the material or comfort.

Rich people ask for the latest most expensive iPhone even if they're only going to use WhatsApp and Instagram on it. It's not because of privacy or functionality, it's simply to show off to everyone they can purchase it. Also to not stand out within their peers as the only one without it.

As another commenter said: it's not an argument, it's a fact here.



I have an iPhone so I guess I qualify as a rich person by your definition. I am also a software engineer. I cannot state enough how bogus that statement is. I've used both iPhone and Android, and recent flagships. iPhone is by far the easiest one to use. Speaking in more objective terms, iPhones have a coherent UI which maintains its consistency both throughout the OS and over the years. They're the most dumbed down phones and easiest to understand. I recommend iPhone to all my friends and relatives.

There's obviously tons of people who see iPhone as a status item. They're right, because iPhone is expensive and only the rich can buy them. This doesn't mean iPhone is not the best option out there for a person who doesn't want to extensively customize his phone and just use it.



> iPhone and Android, and recent flagships. iPhone is by far the easiest one to use. Speaking in more objective terms, iPhones have a coherent UI

It's not about whether you've used Android, it's about whether you've been poor-ish or stingy.

To some people those are luxuries - the most expensive phone they buy is a mid-range Motorola for $300 with a Snapdragon 750G or whatever. They run all the same apps after all, they take photos.

iPhones are simply outside of their budget.



It's not an argument; just ask why people lust after the latest iPhones in poor countries. They do it because they see rich people owning them. Unless you experience that, you won't really understand it.


The cheapest point of entry is absolutely not comparable. The cheapest new iPhone on apple.com is $429. The cheapest new Samsung on samsung.com is $199 (they do have a phone listed for $159, but its button says "Notify Me").

Granted, you may have been leaning very heavily on the dictionary definition of "comparable", in that the two numbers are able to be compared. However, when the conclusion of that comparison is "More than twice the price", I think you should lead with that.

Keep in mind, the iPhone SE is using a 3 year old processor, the Samsung A15 was released 5 months ago with a brand new processor.



I was thinking about this for a while; the problem is not about Apple, it's the fact that the rest of the industry is gutless and has zero vision or leadership. Whatever Apple does, the rest of the industry will follow or oppose - but will be defined by it.

It’s like how people who don’t like US and want nothing to do with US still discuss US politics, because it has so much effect everywhere.

(Ironically, not enough people discuss China with any coherent level of understanding.)



You're absolutely right, I'm so glad that Apple was the first company to release a phone with a touch screen, or a phone with an app store, or a smart watch or a VR headset.

Apple doesn't release new products, they wait until the actual brave and innovating companies have done the exploration and then capitalize on all of their learnings. Because they are never the first movers and they have mountains of cash, they're able to enter the market without the baggage of early adopters. They don't have to worry about maintaining their early prototypes.

Apple doesn't innovate or show leadership, they wait until the innovators have proven that the market is big enough to handle Apple, then they swoop in with a product that combines the visions of the companies that were competing.

Apple is great at what they do, don't get me wrong. And swooping in when the market is right is just good business. Just don't mistake that for innovation or leadership.



They famously had a standoff with the US gov't over the Secure Enclave.

Marketing aside, all indications point to the iOS platform being the most secure mobile option (imo).



This is a prejudiced take. Running AI tasks locally on the device definitely is a giant improvement for the user experience.

But not only that, Apple CPUs are objectively leagues ahead of their competition in the mobile space. I am still using an iPhone released in 2020 with absolutely no appreciable slowdown or loss in perceived performance. Even a 4-year-old iPhone still has specs that don't lag much behind the equivalent Android phones, I still receive the latest OS updates, and, frankly, Android OS is a mess.

If I cared about status, I would have changed my phone already for a new one.



> I am still using an iPhone released in 2020 with absolutely no appreciable slowdown or loss in perceived performance.

My Pixel 4a here is also going strong, only the battery is slowly getting worse. I mean, it's 2024, do phones really still get slow? The 4a is now past android updates, but that was promised after 3 years. But at 350 bucks, it was like 40% less than the cheapest iPhone mini at that time.



> I am still using an iPhone released in 2020 with absolutely no appreciable slowdown or loss in perceived performance.

Only because Apple lost a lawsuit; otherwise they'd have kept intentionally slowing it down.



>Apple CPUs are objectively leagues ahead of their competition in the mobile space

This is a lie. The latest Android SoCs are just as powerful as the A series.

>Because even a 4 years old IPhone still has specs that don't lag behind by much the equivalent Android phones, I still receive the latest OS updates, and because frankly, Android OS is mess.

Samsung and Google offer 7 years of OS and security updates. I believe that beats the Apple policy.



> Samsung and Google offer 7 years of OS and security updates. I believe that beats the Apple policy.

On the second part:

https://en.wikipedia.org/wiki/IPadOS_version_history

The last iPads to stop getting OS updates (including security, to be consistent with what Samsung and Google are pledging) got 7 and 9 years of updates each (5th gen iPad and 1st gen iPad Pro). The last iPhones to lose support got about 7 years each (iPhone 8 and X). 6S, SE (1st), and 7 got 9 and 8 years of OS support with security updates. The 5S (released in 2013) last got a security update in early 2023, so also about 9 years, the 6 (2014) ended at the same time so let's call it 8 years. The 4S, 2011, got 8 years of OS support. 5 and 5C got 7 and 6 years of support (5C was 5 in a new case, so was always going to get a year less in support).

Apple has not, that I've seen at least, ever established a long term support policy on iPhones and iPads, but the numbers show they're doing at least as well as what Samsung and Google are promising to do, but have not yet done. And they've been doing this for more than a decade now.

EDIT:

Reworked the iOS numbers a bit, down to the month (I was looking at years above and rounding, so this is more accurate). iOS support time by device for devices that cannot use the current iOS 17 (so the XS and above are not counted here) in months:

  1st - 32
  3G  - 37
  3GS - 56
  4   - 48
  4S  - 93
  5   - 81
  5C  - 69
  5S  - 112
  6   - 100
  6S  - 102
  SE  - 96
  7   - 90
  8   - 78
  X   - 76
The average is about 76 months, or roughly 6.5 years. If we knock out the first 2 phones (both have somewhat justifiable short support periods, with massive hardware changes between each and their successor) the average jumps to about 83 months, or very nearly 7 years.

The 8 and X look like regressions, but their last updates were just 2 months ago (March 21, 2024) so still a good chance their support period will increase and exceed the 7 year mark like every model since the 5S. We'll have to see if they get any more updates in November 2024 or later to see if they can hit the 7 year mark.
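If anyone wants to check the averages, they fall straight out of the table above (a quick sketch; the month counts are copied as listed):

  support_months = {
      "1st": 32, "3G": 37, "3GS": 56, "4": 48, "4S": 93, "5": 81, "5C": 69,
      "5S": 112, "6": 100, "6S": 102, "SE": 96, "7": 90, "8": 78, "X": 76,
  }
  # Overall average, then again with the first two models excluded.
  overall = sum(support_months.values()) / len(support_months)
  later = [v for k, v in support_months.items() if k not in ("1st", "3G")]
  print(round(overall, 1), round(sum(later) / len(later), 1))  # ~76.4 and ~83.4 months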



>The last iPads to stop getting OS updates (including security, to be consistent with what Samsung and Google are pledging) got 7 and 9 years of updates each (5th gen iPad and 1st gen iPad Pro). The last iPhones to lose support got about 7 years each (iPhone 8 and X). 6S, SE (1st), and 7 got 9 and 8 years of OS support with security updates. The 5S (released in 2013) last got a security update in early 2023, so also about 9 years, the 6 (2014) ended at the same time so let's call it 8 years. The 4S, 2011, got 8 years of OS support. 5 and 5C got 7 and 6 years of support (5C was 5 in a new case, so was always going to get a year less in support).

These are very disingenuous numbers that don't tell the complete story. An iPhone 7 getting a single critical security patch does not take into account the hundreds of security patches it did not receive when it stopped receiving support. It received that special update because Apple likely was told or discovered it was being exploited in the wild.

Google and Samsung now offer 7 years of OS upgrades and 84 months of full security patches. Selectively patching a phone that is out of the support window with a single security patch does not automatically increase its EOL support date.



I look forward to these vendors delivering on their promises, and I look forward to Apple perhaps formalizing a promise with less variability for future products.

Neither of these hopes retroactively invalidates the fact that Apple has had a much better track record of supporting old phone models up to this point. Even if you do split hairs about the level of patching some models got in their later years, they still got full iOS updates for years longer than most Android phones got any patches at all, regardless of severity.

This is not an argument that somehow puts Android on top, at best it adds nuance to just how much better iOS support has been up to this point.

Let's also not forget that if Apple wasn't putting this kind of pressure on Google, they wouldn't have even made the promise to begin with, because it's clear how long they actually care to support products with no outside pressure.



I agree. This is the type of competition I like to see between these two companies. In the end the consumer wins regardless of which one you buy. Google has also promised 10 years of Chromebook support, so they've clearly got the message on the importance of supporting hardware much longer than a lot of people would use them for.


They made that pledge for the Pixel 8 (2023). Let's revisit this in 2030 and see what the nature of their support is at that point and how it compares to Apple's support for iPhone devices. We can't make a real comparison since they haven't done anything yet, only made promises.

What we can do today is note that Apple never made a promise, but did provide very long security support for their devices despite that. They've already met or come close to the Samsung/Google pledge (for one device) on almost half their devices, and those are all the recent ones (so it's not a downward trend of good support then bad support, but rather mediocre/bad support to improving and increasingly good support).

Another fun one:

iPhone XS was released in September 2018, it is on the current iOS 17 release. In the absolute worst case of it losing iOS 18 support in September, it will have received 6 full years of support in both security and OS updates. It'll still hit 7 years (comfortably) of security updates. If it does get iOS 18 support in September, then Apple will hit the Samsung/Google pledge 5 years before Samsung/Google can even demonstrate their ability to follow through (Samsung has a chance, but Google has no history of commitment).

I have time to kill before training for a century ride:

Let's ignore everything before the iPhone 4S; they had short support periods, that's just a fact, and it's hardly worth investigating. This is an analysis of devices released in 2011 and later, when the phones had mostly matured as a device, so we should expect longer support periods. These are the support periods during which the phones were able to run the still-current iOS version, not counting later security updates or minor updates delivered after the major iOS version had been deprecated. As an example, the iPhone 4S had support from 2011-2016. In 2016 its OS, iOS 9, was replaced by iOS 10. Here are the numbers:

  4S       - 5 years
  5        - 5 years
  5C       - 4 years (decreased, 5 hardware but released a year later in a different case)
  5S       - 6 years
  6        - 5 years (decreased, not sure why)
  6S       - 7 years (hey, Apple did it! 2015 release, lost iOS upgrades in 2022)
  SE(1st)  - 5 years (like 5C, 6S hardware but released later)
  7        - 6 years (decreased over 6S, not sure why)
  8        - 6 years
  X        - 6 years
The 6S is a bit of an outlier, hitting 7 years of full support running the current iOS. 5C and SE(1st) both got less total support, but their internals were the same as prior phones and they lost support at the same time as them (this is reasonable, if annoying, and does drag down the average). So Apple has clearly trended towards 6 years of full support, the XS (as noted above) will get at least 6 years of support as of this coming September. We'll have to see if they can get it past the 7 year mark, I know they haven't promised anything but the trend suggests they can.


Sure. They also pledged to support Chromebooks for 10 years. My point is that I don't think they'll be clawing back their new hardware support windows anytime soon. Their data indicates that these devices were used well beyond their initial support window, so it was in their, and their users', best interest to keep them updated as long as they possibly could. 3 years of OS updates and 4 years of security updates was always the weak link in their commitment to security. And this applies to all of their devices including the A series - something I don't see other Android OEMs even matching.

BTW, my daily driver is an iPhone 13 and I was coming from an iPhone X. So I'm well aware of the incredible support Apple provides its phones. Although, I would still like to see an 8+ year promise from them.



I'd say from my experience the average Apple user cares less about privacy than the general public. It's a status symbol first and foremost; 99% of what people do on their phones is basically identical on both platforms at this point.


Apple only pivoted into the “privacy” branding relatively recently [1] and I don't think that many people came for that reason alone. In any case, most are now trapped in the walled garden and the effort to escape is likely too great. And there's no escape anyway, since Google will always make Android worse in that regard…

[1] in 2013 they even marketed their “iBeacon” technology as a way for retail stores to monitor and track their customers which…



Ca 2013 was the release of the Nexus 5, arguably the first really usable android smartphone.

Privacy wasn’t really a concern because most people didn’t have the privacy-eroding device yet. The years following the Nexus 5 are when smartphones went into geometric growth and the slow realization of the privacy nightmare became apparent.

Imho I was really excited to get a Nexus 4 at the time, just a few short years later the shine wore off and I was horrified at the smartphone enabled future. And I have a 40 year background in computers and understand them better than 99 out of 100 users – if I didn’t see it, I can’t blame them either



> Ca 2013 was the release of the Nexus 5, arguably the first really usable android smartphone.

What a strange statement. I was late to the game with a Nexus S in 2010, and it was really usable.



Define usable. Imho before the Nexus 4 everything was crap, the Nexus 4 was barely enough (4x1.4 GHz), and the Nexus 5 (4x2.2 GHz) plus the software at the time (post-KitKat) was when it was really ready for the mainstream.


>The promise of privacy is one way in which they position themselves, but I would not bet the bank on that being true forever.

Everyone seems to have forgotten about the Celebrity iCloud photo leak.



The entire software stack is non-free and closed-source. This means you'd be taking Apple at their word on "privacy". Do you trust Apple? I wouldn't, given their track record.


I think it will be a winning strategy. Lag is a real killer for LLMs.

I think they'll have another LLM on a server (maybe a deal for openai/gemini) that the one on the device can use like ChatGPT uses plugins.

But on device Apple have a gigantic advantage. Rabbit and Humane are good ideas humbled by shitty hardware that runs out of battery, gets too hot, has to connect to the internet to do literally anything.

Apple is in a brilliant position to solve all those things.

I hope they announce something good at WWDC



I run a few models (e.g. Llama3:8b) on my 2023 MacBook Air, and there is still a fair bit of lag and delay compared to a hosted (and much larger) model like Gemini. A large source of the lag is the initial loading of the model into RAM, which an iPhone will surely suffer from as well.
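
A rough way to see that load-time cost for yourself (a sketch assuming Ollama is installed and the llama3:8b weights have already been pulled; the prompt and timing approach are just for illustration):

  import subprocess, time

  def timed_run(prompt):
      # Shells out to the local Ollama CLI; assumes `ollama` is on PATH
      # and `ollama pull llama3:8b` has already been run.
      start = time.time()
      subprocess.run(["ollama", "run", "llama3:8b", prompt],
                     capture_output=True, text=True, check=True)
      return time.time() - start

  cold = timed_run("Say hi in one word.")  # includes loading the weights into RAM
  warm = timed_run("Say hi in one word.")  # model is still resident, so much faster
  print(f"cold: {cold:.1f}s, warm: {warm:.1f}s")

The gap between the two runs is roughly the model-load cost I'm describing.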

Humane had lag, and they used voice chat, which is a bad UX paradigm. VUI is bad because it adds lag to the information within the medium. Listening to preambles and lists is always slower than the human eye's ability to scan a page of text. Their lag is not due to LLMs, which can be much faster than whatever they did.

We should remind ourselves that an iPhone can likely suffer similar battery and heat issues - especially if it’s running models locally.



Humane's lag feels down to just bad software design too. It almost feels like a two-stage thing is happening: it sends your voice or transcription up to the cloud, figures out where it needs to go to get it done, tells the device to tell you it's about to do that, then finally does it. E.g.

User: "What is this thing?"

Pin: "I'll have a look what that is" (It feels this response has to come from a server)

Pin: "It's a " (The actual answer)

We're still a bit away from an iPhone running anything viable locally. Even with small models today you can almost feel the chip creaking under the load they put on it, and the whole phone begins to choke.



> Lag is a real killer for LLMs.

I'm curious to hear more about this. My experience has been that inference speeds are the #1 cause of delay by orders of magnitude, and I'd assume those won't go down substantially on edge devices because the cloud will be getting faster at approximately the same rate.

Have people outside the US benchmarked OpenAI's response times and found network lag to be a substantial contributor to slowness?
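
If anyone wants to measure it, here is a rough sketch that separates time-to-first-token (mostly network and queueing) from total generation time, using the OpenAI Python client with streaming; the model name and prompt are just placeholders:

  import time
  from openai import OpenAI

  client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

  start = time.time()
  first_token_at = None
  stream = client.chat.completions.create(
      model="gpt-4o-mini",  # placeholder model name
      messages=[{"role": "user", "content": "Write one sentence about latency."}],
      stream=True,
  )
  for chunk in stream:
      if first_token_at is None and chunk.choices and chunk.choices[0].delta.content:
          first_token_at = time.time()
  total = time.time() - start
  print(f"first token: {first_token_at - start:.2f}s, total: {total:.2f}s")

If the first-token number dominates and shrinks when you test from a closer region, the network matters; if total generation time dominates, moving inference to the edge won't help much on its own.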



> * when you have a good internet connection

Or at least, a good enough internet connection to send plaintext.

> * when you live in the USA

Even from Australia to the USA it's just ~300ms of latency to the first token, and then the whole thing can finish in ~1s. And making that faster doesn't require on-device deployment, it just requires a server in Australia - which is obviously going to be coming if it hasn't already for many providers.



There really isn't enough emphasis on the downsides of server side platforms.

So many of these are only deployed in the US, so if you're in, say, country Australia, not only does all your traffic go to the US, it also goes over slow and intermittent cellular connections.

It makes using services like LLMs unusably slow.

I miss the 90s and having applications and data reside locally.



Even in Australia, is the LLM lag to a server noticeable?

Generally an LLM seems to take about 3s or more to respond, and the network delay to the US is a couple of hundred milliseconds.

The network delay seems minimal compared to the actual delay of the LLM.



I wonder if BYOE (bring your own electricity) also plays a part in their long-term vision? Data centres are expensive in terms of hardware, staffing and energy. Externalising this cost to customers saves money, but also helps to paint a green(washing) narrative. It's more meaningful to more people to say they've cut their energy consumption by x than to say they have a better server obsolescence strategy, for example.


From your link:

> "Apple defines high-quality credits as those from projects that are real, additional, measurable, and quantified, with systems in place to avoid double-counting, and that ensure permanence."

Apple then pledged to buy carbon credits from a company called Verra. In 2023, an investigation found that more than 90% of Verra's carbon credits are a sham. Notably, Apple made their pledge after the results of this investigation were known - so much for their greenwashing.

https://www.theguardian.com/environment/2023/jan/18/revealed...



That is an interesting angle to look at it from. If they're gonna keep pushing this they end up with a strong incentive to make the iPhone even more energy efficient, since users have come to expect good/always improving battery life.

At the end of the day, though, AI workloads in the cloud will always be a lot more compute-efficient, meaning a lower combined footprint. On the other hand, in the server-based model there is more incentive to pre-compute (waste inference on) things to make them appear snappy on device. An analogy would be all that energy spent encoding YouTube videos that never get watched. Although that's "idle" resources for budgeting purposes.



I’m not sure it’s that (benched pointed out their carbon commitment) so much as simple logistics.

Apple doesn’t have to build the data centers. Apple doesn’t have to buy the AI capacity themselves (even if from TSMC for Apple designed chips). Apple doesn’t have to have the personnel for the data centers or the air conditioning. They don’t have to pay for all the network bandwidth.

There are benefits to the user to having the AI run on their own devices in terms of privacy and latency as mentioned by the GP.

But there are also benefits to Apple simply because it means it’s no longer their resources being used up above and beyond electricity.

I keep reading about companies having trouble getting GPUs from the cloud providers and that some crypto networks have pivoted to selling GPU access for AI work as crypto profits fall.

Apple doesn’t have to deal with any of that. They have underused silicon sitting out there ready to light up to make their customers happy (and perhaps interested in buying a faster device).



I agree with everything you said but the TSMC bit. They are quite literally competing with Nvidia et al. for fab space for customers' chips. Sure, they get the AI bits built into existing products, but surely those are bigger/more expensive to manufacture and commit to at TSMC because of it.


If you offload data processing to the end user, then your data center uses less energy on paper. The washing part is that work is still being done and spending energy, just outside of the data center.


Honestly, it's still good for the environment to have the work distributed across the entire electricity grid.

That work needs to be done anyways and Apple is doing it in the cleanest way possible. What’s an alternative in your mind, just don’t do the processing? That sounds like making progress towards being green. If you’re making claims of green washing you need to be able to back it up with what alternative would actually be “green”.



I didn't make any claims, I just explained what the parent was saying. There could be multiple ways to make it more green: one being not doing the processing, or another perhaps just optimizing the work being done. But actually, no, you don't need a viable way to be green in order to call greenwashing "greenwashing." It can just be greenwashing, with no alternative that is actually green.


> Which honestly is still good for the environment to have the work distributed across the entire electricity grid.

This doesn't make any sense.

> If you’re making claims of green washing you need to be able to back it up with what alternative would actually be “green”.

Sometimes there isn't an alternative. In which case you don't get to look green, sorry. The person critiquing greenwashing doesn't need to give an alternative, why would that be their job? They're just evaluating whether it's real or fake.

Though in this case using renewable energy can help.



> Which honestly is still good for the environment to have the work distributed across the entire electricity grid.

Sometimes, but parallelization has a cost. The power consumption from 400,000,000 iPhones downloading a 2gb LLM is not negligible, probably more than what you'd consume running it as a REST API on a remote server. Not to mention slower.



Downloading 2GB of anything on my iPhone via Wi-Fi from my in-home gigabit fiber barely puts a dent in my battery life, let alone takes much time.

The random ads in most phone games are much worse on my battery life.



Yeah, it's a shame that mobile games are shit when console and PC gaming gets taken so seriously by comparison. If you want to blame that on developers and not Apple's stupid-ass policies stopping you from emulating real games, be my guest. That's a take I'd love to hear.

Keep downloadin' those ads. This is what Apple wants from you, a helpless and docile revenue stream. Think Different or stay mad.



Blame? Simply saying downloading 2gb isn’t the power consumption hit you seem to think it is.

Not much of a gamer anyway, just an observation when I tried a couple apparently dodgy games.

Not sure why your reply wasn’t related to your original comment. Felt rather knee-jerk reactionary to me instead. Oh well.



It makes sense for desktops but not for devices with batteries. I think Apple should introduce a new device for $5-10k that has 400GB of VRAM that all Macs on the network use for ML.

If you're on battery, you don't want to do LLM inference on a laptop. Hell, you don't really want to do transcription inference for that long - but would be nice not to have to send it to a data center.



The fundamental problem with this strategy is model size. I want all my apps to be privacy first with local models, but there is no way they can share models in any kind of coherent way. Especially when good apps are going to fine tune their models. Every app is going to be 3GB+


> In case it is not abundantly clear by now: Apple's AI strategy is to put inference (and longer term even learning)

I'm curious: is anyone seriously using Apple hardware to train AI models at the moment? Obviously not the big players, but I imagine it might be a viable option for AI engineers in smaller, less ambitious companies.



I like to think back to 2011 and paraphrase what people were saying: "Is anyone seriously using GPU hardware to write NL translation software at the moment?"

"No, we should use cheap, commodity, abundantly available CPUs and orchestrate them behind cloud magic to write our NL translation apps"

or maybe "no, we should build purpose-built high performance computing hardware to write our NL translation apps"

Or perhaps in the early 70s: "is anyone seriously considering personal computer hardware to ...". "no, we should just buy IBM mainframes ..."

I don't know. I'm probably super biased. I like the idea of all this training work breaking the shackles of cloud/mainframe/servers/off-end-user-device and migrating to run on people's devices. It feels "democratic".



I remember having lunch with a speech recognition researcher who was using GPUs to train DNNs to do speech recognition in 2011. It really was thought of as niche back then. But the writing was on the wall I guess in the results they were getting.


Apple are. Their “Personal Voice” feature fine-tunes a voice model on device using recordings of your own voice. [1]

An older example is the “Hey Siri” model, which is fine-tuned to your specific voice.

But with regards to on-device training, I don’t think anyone is seriously looking at training a model from scratch on device; that doesn’t make much sense. But taking models and fine-tuning them to specific users makes a whole ton of sense, and is an obvious approach to producing “personal” AI assistants.

[1] https://support.apple.com/en-us/104993



They already do some “simple” training on device. The example I can think of is photo recognition in the photo library. It likely builds on something else, but being able to identify which face is your grandma versus your neighbor is not done in Apple's cloud. It's done when your devices are idle and plugged into power.

A few years ago it wasn't shared between devices, so each device had to do it itself. I don't know if it's shared at this point.

I agree you're not going to be training an LLM or anything. But smaller tasks, limited in scope, may prove a good fit.



Not really (I work on AI/ML Infrastructure at a well known tech company and talk regularly w/ our peer companies).

That said, inference on Apple products is a different story. There's definitely interest in inference on the edge. So far though, nearly everyone is still opting for inference in the cloud, for a few reasons:

1. There's a lot of extra work involved in getting ML/AI models ready for mobile inference. And this work is different for iOS vs. Android.

2. You're limited on which exact device models will run the thing optimally. Most of your customers won't necessarily have those. So you need some kind of fallback.

3. You're limited on what kind of models you can actually run. You have way more flexibility running inference in the cloud.



A cloud solution I looked at a few years ago could be replicated (poorly) in your browser today. In my mind the question has become one of determining when my model is useful enough to detach from the cloud, not whether that should happen.


Mobile can be more efficient. But you're making big tradeoffs. You are very limited in what you can actually run on-device. And ultimately you're also screwing over your user's battery life, etc.


I’ve found it to be pretty terrible compared to CUDA, especially with Huggingface transformers. There’s no technical reason why it has to be terrible there though. Apple should fix that.
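
For what it's worth, the basic plumbing does exist: PyTorch exposes Apple GPUs through the "mps" backend, and Hugging Face models can be moved onto it the same way you'd move them to CUDA. A minimal sketch (the model is just a small example, and plenty of ops still fall back to the CPU):

  import torch
  from transformers import AutoModelForCausalLM, AutoTokenizer

  device = "mps" if torch.backends.mps.is_available() else "cpu"

  tok = AutoTokenizer.from_pretrained("gpt2")            # small example model
  model = AutoModelForCausalLM.from_pretrained("gpt2").to(device)

  inputs = tok("Apple silicon says:", return_tensors="pt").to(device)
  out = model.generate(**inputs, max_new_tokens=20)
  print(tok.decode(out[0], skip_special_tokens=True))

The complaint above is less about whether this runs at all and more about how well it's optimized compared to CUDA.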


MLX will probably be even faster than that, if the model is already ported. Faster startup time too. That’s my main pet peeve though: there’s no technical reason why PyTorch couldn’t be just as good. It’s just underfunding and neglect
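
For context, MLX is Apple's array framework built around unified memory and lazy evaluation; a minimal sketch of using it directly looks like this (shapes are arbitrary):

  import mlx.core as mx

  a = mx.random.normal((4096, 4096))
  b = mx.random.normal((4096, 4096))

  c = a @ b    # builds a lazy computation graph; nothing has run yet
  mx.eval(c)   # forces evaluation on the default (GPU) device

  print(c.shape, c.dtype)

Ports of popular models exist on top of this, but as noted, that shouldn't be necessary if the PyTorch MPS backend were treated as a first-class target.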


Does one need to train an AI model on specific hardware, or can a model be trained in one place and then used somewhere else? Seems like Apple could just run their fine tuned model called Siri on each device. Seems to me like asking for training on Apple devices is missing the strategy. Unless of course, it's just for purely scientific $reasons like "why install Doom on the toaster?" vs doing it for a purpose.


It doesn’t require specific hardware; you can train a neural net with pencil and paper if you have enough time. Of course, some pieces of hardware are more efficient than others for this.


For business-scale model work, sure.

But you can get an M2 Ultra with 192GB of UMA for $6k or so. It's very hard to get that much GPU memory at all, let alone at that price. Of course the GPU processing power is anemic compared to a DGX Station 100 cluster, but the mac is $143,000 less.



You want to buy a bunch of new equipment to do training? Yeah, Macs aren’t going to make sense.

You want your developers to be able to do training locally and they already use Macs? Maybe an upgrade would make business sense. Even if you have beefy servers or the cloud for large jobs.



Yes, it can be more cost effective for smaller businesses to do all their work on Mac Studios, versus having a dedicated Nvidia rig plus Apple or Linux hardware for your workstation.

Honestly, you can train basic models just fine on M-Series Max MacBook Pros.



A non-decked out Mac Studio is a hell of a machine for $1999.

Do you also compare cars by looking at only the super expensive limited editions, with every single option box ticked?

I'd also point out that said 3 year old $1999 Mac Studio that I'm typing this on already runs ML models usefully, maybe 40-50% of the old 3000-series Nvidia machine it replaces, while using literally less than 10% of the power and making a tiny tiny fraction of the noise.

Oh, and it was cheaper. And not running Windows.



No.

For training, the Macs do have some interesting advantages due to the unified memory. The GPU cores have access to all of system RAM (and the system RAM is ridiculously fast - 400GB/sec when DDR4 is barely 30GB/sec - which has a lot of little fringe benefits of its own, part of why the Studio feels like an even more powerful machine than it actually is. It's just super snappy and responsive, even under heavy load.)

The largest consumer Nvidia card has 24GB of RAM, of which maybe 22GB is actually usable.

The $1999 Mac has 32GB, and for $400 more you get 64GB.

$3200 gets you 96GB, and more GPU cores. You can hit the system max of 192GB for $5500 on an Ultra, albeit with the lesser GPU.

Even the recently announced 6000-series AI-oriented NVidia cards max out at 48GB.

My understanding is a that a lot of enthusiasts are using Macs for training because for certain things having more RAM is just enabling.



The huge amount of optimizations available on Nvidia and not available on Apple makes the reduced VRAM worth it, because even the most bloated foundation model will have some magical 0.1-bit quantization technique invented by a turbo-nerd, and it will only work on Nvidia.

I keep hearing this meme of Macs being a big deal in LLM training, but I have seen zero evidence of it, and I am deeply immersed in the world of LLM training, including training from scratch.

Stop trying to meme Apple M chips as AI accelerators. I'll believe it when unsloth starts to support a single non-Nvidia chip.



Yeah, and I think people forget all the time that inference (usually batch_size=1) is memory bandwidth bound, but training (usually batch_size=large) is usually compute bound. And people use enormous batch sizes for training.

And while the Mac Studio has a lot of memory bandwidth compared to most desktops CPUs, it isn't comparable to consumer GPUs (the 3090 has a bandwidth of ~936GBps) let alone those with HBM.

I really don't hear about anyone training on anything besides NVIDIA GPUs. There are too many useful features like mixed-precision training, and don't even get me started on software issues.
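
As a back-of-the-envelope illustration of the bandwidth bound for batch-1 decoding (rough numbers, ignoring KV-cache traffic and assuming all weights are read once per generated token):

  # Upper bound on tokens/sec for batch_size=1 decoding:
  # every new token has to stream the full set of weights through memory.
  model_bytes = 8e9 * 0.5   # ~8B parameters at 4-bit quantization, roughly 4 GB

  for name, bandwidth in [("Mac Studio (~400 GB/s)", 400e9),
                          ("RTX 3090 (~936 GB/s)", 936e9)]:
      print(f"{name}: <= {bandwidth / model_bytes:.0f} tokens/sec")

With a large training batch, each weight read is reused across the whole batch, which is exactly why training flips from bandwidth bound to compute bound.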



If you work for a company willing to shell out sure there are better options.

But for individual developers it’s an interesting proposition.

And a bigger question is: what if you already have (or were going to buy) a Mac? You prefer them or maybe are developing for Apple platforms.

Upping the chip or memory could easily be cheaper than getting a PC rig that’s faster for training. That may be worth it to you.

Not everyone is starting from zero or wants the fastest possible performance money can buy ignoring all other factors.



Don't attack me, I'm not disagreeing with you that an nVidia GPU is far superior at that price point.

I simply want to point out that these folks don't really care about that. They want a Mac for more reasons than "performance per watt/dollar" and if it's "good enough", they'll pay that Apple tax.

Yes, yes, I know, it's frustrating and they could get better Linux + GPU goodness with an nVidia PC running Ubuntu/Arch/Debian, but macOS is painless for the average science AI/ML training person to set up and work with. There are also known enterprise OS management solutions that business folks will happily sign off on.

Also, $7000 is chump change in the land of "can I get this AI/ML dev to just get to work on my GPT model I'm using to convince some VC's to give me $25-500 million?"

tldr; they're gonna buy a Mac cause it's a Mac and they want a Mac and their business uses Macs. No amount of "but my Nvidia GPU = better" is ever going to convince them otherwise as long as there is a "sort of" reasonable price point inside Apple's ecosystem.



> a dedicated Nvidia rig

I am honestly shocked Nvidia has been allowed to maintain their moat with cuda. It seems like AMD would have a ton to gain just spending a couple million a year to implement all the relevant ML libraries with a non-cuda back-end.



AMD doesn’t really seem inclined toward building developer ecosystems in general.

Intel seems like they could have some interesting stuff in the annoyingly named “OneAPI” suite but I ran it on my iGPU so I have no idea if it is actually good. It was easy to use, though!



There are quite a few back and forth X/Twitter storms in teacups between George Hotz / tinygrad and the AMD management about opening up the firmware for custom ML integrations to replace CUDA but last I checked they were running into walls


My cynical view is that doing AI on the client is the only way they can try to keep selling luxury items (jewelry really) and increasing prices for what are essentially and functionally commodity devices.


I'm all for running as much on the edge as possible, but we're not even close to being able to do real-time inference on Frontier models on Macs or iPads, and that's just for vanilla LLM chatbots. Low-precision Llama 3-8b is awesome, but it isn't a Claude 3 replacer, totally drains my battery, and is slow (M1 Max).

Multimodal agent setups are going to be data center/home-lab only for at least the next five years.

Apple isn't about to put 80GB of VRAM in an iPad for about 15 reasons.



Yes, that would be great. But without the ability for us to verify this, who's to say they won't use the edge resources (your computer and electricity) to process data (your data) and then send the results to their data center? It would certainly save them a lot of money.


When you can do all inference at the edge, you can keep it disconnected from the network if you don't trust the data handling.

I happen to think they wouldn't, simply because sending this data back to Apple in any form that they could digest it is not aligned with their current privacy-first strategies. But if they make a device that still works if it stays disconnected, the neat thing is that you can just...keep it disconnected. You don't have to trust them.



Except that's an unreasonable scenario for a smartphone. It doesn't prove that the minute the user goes online it won't be egressing data, willingly or not.


I don't disagree, although when I composed my comment I had desktop/laptop in mind, as I think genuinely useful on-device smartphone AI is a ways off yet, and who knows what company Apple will be by then.


If you trust that Apple doesn't film you with the camera when you use the phone while sitting on the toilet, why wouldn't you trust Apple now?

It would have to be a huge conspiracy involving all of Apple's employees. And you can easily just listen to the network and see whether they do it or not.



I find it somewhat hard to believe that wouldn’t be in contravention of some law or other. Or am I wrong?

Of course we can then worry that companies are breaking the law, but you have to draw the line somewhere… and what have they to gain anyway?



+1. The idea that because it's on device it's privacy-preserving is Apple's marketing machine speaking, and that doesn't fly anymore. They have to do better to convince any security and privacy expert worth their salt that their claims and guarantees can be independently verified on behalf of iOS users.

Google did some of that on Android, which meant open-sourcing their on-device TEE implementation, publishing a paper about it, etc.



There is no guarantee that local processing is going to have lower latency than remote processing. Given the huge compute needs of some AI models (e.g. ChatGPT), the time saved by using larger compute likely dwarfs the relatively small time needed to transmit a request.


For everyone else who doesn't understand what this means, he's saying Apple wants you to be able to run models on their devices, just like you've been doing on nvidia cards for a while.


I think he's saying they want to make local AI a first class, default, capability, which is very unlike buying a $1k peripheral to enable it. At this point (though everyone seems to be working on it), other companies need to include a gaming GPU in every laptop, and tablet now (lol), to enable this.


Awesome. I'm going to go tell my mom she can just pull her Nvidia card out of her pocket at the train station to run some models.

On second thought.. maybe it isn't "just like you've been doing on Nvidia cards for a while"



Yes, at the end it's just some data representing the user's trained model. Is there a contractual agreement with users that Apple will never ever transfer a single byte of it, with huge penalties otherwise? If not, it's a pinky PR promise that sounds nice.


Apple publicly documents their privacy and security practices.

At minimum, laws around the world prevent companies from knowingly communicating false information to consumers.

And in many countries the rules around privacy are much more stringent.



But what does that have to do with the price of milk in Turkmenistan?

Because Boeing's issues have nothing to do with privacy or security, and since they are not consumer-facing, they have no relevance to what we are talking about.



Yes, it is completely clear. My guess is they do something like "Siri-powered shortcuts", where you can ask it to do a couple of things and it'll dynamically create a script and execute it.

I can see a smaller model trained to do that working well enough; however, I've never seen any real working examples of this. The Rabbit device is heading in that direction, but it's mostly vaporware now.



Has nothing to do with privacy; Google is also pushing Gemini Nano to the device. The sector is discovering the diminishing returns of LLMs.

With the AI cores on phones they can cover the average user's use cases with a light model, without the server expense.



I think these days everyone links their products with AI. Today even BP's CEO linked his business with AI. Edge inference and cloud inference are not mutually exclusive choices. Any serious provider will provide both, and the improvement in quality of service comes from you giving more of your data to the service provider. Most people are totally fine with that, and that will not change anytime soon. Privacy paranoia is mostly a fringe thing in consumer tech.


I agree. Apple has been on this path for a while, the first processor with a Neural Engine was the A11 in 2017 or so. The path didn’t appear to change at all.

The big differences today that stood out to me were adopting AI as a term (they used machine learning before) and repeating the term AI everywhere they could shove it in since that’s obviously what the street wants to hear.

That’s all that was different. And I’m not surprised they emphasized it given all the weird “Apple is behind on AI“ articles that have been going around.



> Processing data at the edge also makes for the best possible user experience because of the complete independence of network connectivity and hence minimal latency.

That's one particular set of trade-offs, but not necessarily the best. Eg if your network connection and server processing speed is sufficiently faster than your local processing speed, the latency would be higher for doing it locally.

Local inference can also use more battery power. And you need a more beefy device, all else being equal.



Something they should be able to do now, but do not seem to, is allow you to train Siri to recognize exactly your voice and accent. Which is to say, to take the speech-to-text model that is listening and feeding the Siri integration API, and make it both 99.99% accurate for your speech and able to recognize you and only you when it comes to invoking voice commands.

It could, if it chose to, continue to recognize all voices but at the same time limit the things the non-owner could ask for based on owner preferences.



This is really easy to do: it's just an embedding of your voice, so typically 10-30 seconds max of your voice to configure it. You already do a similar setup for Face ID. I agree with you; I don't understand why they don't do it.
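
A hand-wavy sketch of that enroll-then-verify flow, with a stand-in embed() function in place of a real speaker-embedding model (none of these names are real Apple APIs):

  import numpy as np

  def embed(clip_path):
      # Stand-in for a real on-device speaker-embedding model. It fakes the key
      # property (clips from the same speaker land close together) by seeding
      # on the speaker prefix of a made-up filename.
      speaker = clip_path.split("_")[0]
      base = np.random.default_rng(abs(hash(speaker)) % (2**32)).normal(size=192)
      noise = np.random.default_rng().normal(scale=0.05, size=192)
      return base + noise

  def cosine(a, b):
      return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

  # Enrollment: average embeddings from ~10-30 seconds of the owner's speech.
  owner_profile = np.mean([embed(f"owner_clip{i}.wav") for i in range(3)], axis=0)

  def is_owner(clip_path, threshold=0.75):   # threshold is made up
      return cosine(embed(clip_path), owner_profile) >= threshold

  print(is_owner("owner_clip4.wav"), is_owner("stranger_clip1.wav"))  # True False

The real work is in the embedding model itself; the matching step on top of it can be this small.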


I hope this means AI-accelerated frameworks get better support on Mx. Unified memory and Metal are a pretty good alternative for local deep learning development.


>Apple's AI strategy is to put inference (and longer term even learning) on edge devices.

Apple's AI strategy is to put inference (and longer term even learning) on edge devices...only for Apple stuff.

There is a big difference. ANE right now is next to useless for anything not Apple.



Ehhh at this point Apple’s privacy strategy is little more than marketing. Sure they’ll push stuff to the edge to save themselves money and book the win, but they also are addicted to the billions they make selling your searches to Google.

Agreed on the UX improvements though.



Right. The difference is that Apple has a ton of edge capacity, they’ve been building it for a long time.

Google and Samsung have been building it too, at different speeds.

Intel and AMD seem further behind (at the moment) unless the user has a strong GPU, which is especially uncommon on the most popular kind of computer: laptops.

And if you’re not one of those four companies… you probably don’t have much capable consumer edge hardware.



So for hardware accelerated training with something like PyTorch, does anyone have a good comparison between Metal vs Cuda, both in terms of performance and capabilities?


I've been saying the same thing since the ANE and the incredible new chips with shared RAM arrived: suddenly everyone could run capable local models. But then Apple decided to be catastrophically stingy once again, putting a ridiculous 8GB of RAM in these new iPads and their new MacBook Airs, destroying the prospect of a widespread "intelligent local Siri" because now half the new generation can't run anything.

Apple is an amazing powerhouse but also disgustingly elitist and wasteful, if not straight up vulgar in its profit motives. There's really zero idealism there despite their romantic and creative legacy.

There are always some straight-up idiotic limitations in their otherwise incredible machines, with no other purpose than to create planned obsolescence, "PRO" exclusivity and piles of e-waste.



And yet Siri is super slow because it does the processing off-device, and is far less useful than it could be because it is cobbled with restrictions.

I can't even find a way to resume playing whatever Audible book I was last playing. "Siri play audible" or something. As far as I know, this is impossible to do.



These don’t sit on the edge of the internet, and typically are not called edge devices.

It’s usually a more powerful device such as a router or mini server between LAN and internet.



How is local more private? Whether AI runs on my phone or in a data center I still have to trust third parties to respect my data. That leaves only latency and connectivity as possible reasons to wish for endpoint AI.


If you can run AI in airplane mode, you are not trusting any third party, at least until you reconnect to the Internet. Even if the model was malware, it wouldn’t be able to exfiltrate any data prior to reconnecting.

You’re trusting the third party at training time, to build the model. But you’re not trusting it at inference time (or at least, you don’t have to, since you can airgap inference).



Honestly, if they manage this, they have my money. But to get actually powerful models running, they need to supply the devices with enough RAM - and that's definitely not what Apple like to do.


This comment is odd. I wouldn't say it is misleading, but it is odd because it borders on such definition.

> Apple's AI strategy is to put inference (and longer term even learning) on edge devices

This is pretty much everyone's strategy. Model distillation is huge because of this. This goes in line with federated learning. This goes in line with model pruning too. And parameter efficient tuning and fine tuning and prompt learning etc.

> This is completely coherent with their privacy-first strategy

Apple's marketing for their current approach is privacy-first. They are not privacy first. If they were privacy first, you would not be able to use app tracking data on their first party ad platform. They shut it off for everyone else but themselves. Apple's approach is walled garden first.

> Processing data at the edge also makes for the best possible user experience because of the complete independence of network connectivity

as long as you don't depend on graph centric problems where keeping a local copy of that graph is prohibitive. Graph problems will become more common. Not sure if this is a problem for apple though. I am just commenting in general.

> If (and that's a big if) they keep their APIs open to run any kind of AI workload on their chips

Apple does not have a good track record of this; they are quite antagonistic when it comes to this topic. Gaming on Apple was dead for nearly a decade (and pretty much still is) because Steve Jobs did not want people gaming on Macs. Apple has eased up on this, but it very much seems that if they want you to use their devices (not yours) in a certain way, then they make it expensive to do anything else.

Tbf, I don't blame apple for any of this. It is their strategy. Whether it works or not, it doesn't matter. I just found this comment really odd since it almost seemed like evangelism.

edit: it's weird to praise Apple for on-device training when it is not publicly known whether they have trained any substantial model even in the cloud.



Everyone’s strategy?

The biggest players in commercial AI models at the moment - OpenAI and Google - have made absolutely no noise about pushing inference to end user devices at all. Microsoft, Adobe, other players who are going big on embedding ML models into their products, are not pushing those models to the edge, they’re investing in cloud GPU.

Where are you picking up that this is everyone’s strategy?



I believe at least Google is starting to do edge inference; take a look at the Pixel 8 lineup they just announced. It doesn't seem to be emphasized as much, but the Tensor G3 chip certainly has built-in inference hardware.


> Where are you picking up that this is everyone’s strategy?

Read what their engineers say in public. Unless I hallucinated years of federated learning.

Also, Apple isn't even a player yet and everyone is discussing how they are moving stuff to the edge, lol. You can't critique companies for not being on the edge yet when Apple doesn't have anything out there.



> This is pretty much everyone's strategy.

I think this is being too charitable on the state of "everyone". It's everyone's goal. Apple is actively achieving that goal, with their many year strategy of in house silicon/features.



> Apple is actively achieving that goal, with their many year strategy of in house silicon/features

So are other companies, with their many-year strategy of actually building models that are accessible to the public.

Yet Apple is "actively" achieving the goal without any distinct models.



No. "On edge" is not a model existence limitation, it is a hardware capability/existence limitation, by definition, and by the fact that, as you point out, the models already exist.

You can already run those open weight models on Apple devices, on edge, with huge improvements on the newer hardware. Why is a distinct model required? Do the rumors appease these thoughts?

If others are making models, with no way to actually run them, that's not a viable "on edge" strategy, since it involves waiting for someone else to actually accomplish the goal first (as is being done by Apple).



> "On edge" is not a model existence limitation

It absolutely is. Model distillation will still be pertinent. And so will parameter-efficient tuning for edge training. I cannot emphasize enough how important this is. You will need your own set of weights. If Apple wants to use open weights, then sure, ignore this. It doesn't seem like they want to long-term... And even if they use open weights, they will still be behind other companies that have done model distillation and federated learning for years.

> Why is a distinct model required?

Ask apple's newly poached AI hires this question. Doesn't seem like you would take an answer from me.

> If others are making models, with no way to actually run them

Is this the case? People have been running distilled llamas on rPis with pretty good throughput.



> And even if they use open weights, they will still be behind other companies that have done model distillation and federated learning for years.

I'm sorry, but we're talking about "on edge" here though. Those other companies have no flipping hardware to run it "on edge", in a "generic" way, which is the goal. Apple's strategy involves the generic.

> If apple wants to use open weights

This doesn't make sense. Apple doesn't dictate the models you can use with their hardware. You can already accelerate LLAMA with the neural engines. You can download the app right now. You can already deploy your models on edge, on their hardware. That is the success they're achieving. You cannot effectively do this on competitor hardware, with good performance, from "budget" to "Pro" lineup, which is a requirement of the goal.

> they will still be behind other companies that have done model distillation and federated learning for years.

What hardware are they running it on? Are they taking advantage of Apple (or other) hardware in their strategy? Federated learning is an application of "on edge", it doesn't *enable* on edge, which is part of Apple's strategy.

> Ask apple's newly poached AI hires this question. Doesn't seem like you would take an answer from me.

Integrating AI in their apps/experience is not the same as enabling a generic "on edge", default, capability in all Apple devices (which they have been working towards for years now). This is the end goal for "on edge". You seem to be talking about OS integration, or something else.

> People have been running distilled llamas on rPis with pretty good throughput.

Yes, the fundamental limitation there being hardware performance, not the model, with that "pretty good" making for a "pretty terrible" user experience. But there's also nothing stopping anyone from running these distilled (a requirement of limited hardware) models on Apple hardware, taking advantage of Apple's fully defined "on edge" strategy. ;) Again, you can run llamas on Apple silicon, accelerated, as I do.



> Those other companies have no flipping hardware to run it "on edge", in a "generic" way, which is the goal

Maybe? This is why I responded to:

> It's everyone's goal. Apple is actively achieving that goal

This is the issue I found disagreeable. Other organizations and individual people are achieving that goal too. Google says Gemini Nano is going on device, and if the benchmarks are to be believed and it runs at that level, their work so far is also actively achieving that goal. Meta has released multiple distilled models that people have already proven can run inference at the device level. It cannot be argued that Meta is not actively achieving that goal either. They don't have to release the hardware because they went a different route. I applaud Apple for the M chips. They are super cool. People are still working on using them, so Apple can realize that goal too.

So when you go to the statement that started this

> Apple's AI strategy is to put inference (and longer term even learning) on edge devices

Multiple orgs also share this. And I can't say that one particular org is super ahead of the others. And I can't elevate apple in that race because it is not clear that they are truly privacy-focused or that they will keep APIs open.

> You cannot effectively do this on competitor hardware, with good performance, from "budget" to "Pro" lineup, which is a requirement of the goal

Why do you say you cannot do this with good performance? How many tokens per second do you want from a device? Is 30 tok/s enough? You can do that on laptops running a small Mixtral.

> What hardware are they running it on? Are they taking advantage of Apple (or other) hardware in their strategy?

I don't know. I have nothing indicating necessarily apple or nvidia or otherwise. Do you?

> [Regarding the rest]

Sure. My point is that they definitely have an intent for bespoke models, and why I raised the point that not all computation will be feasible on the edge for the time being. What prompted this particular line of inquiry is whether a pure edge experience truly enables the best user experience. That's also why I raised the point about Apple's track record with open APIs, and why "actively achieving" is something I put doubt on. And I also cast doubt on Apple being privacy-focused. Just to tie it back to the reason I even commented.



> This is completely coherent with their privacy-first strategy (which would be at odds with sending data up to the cloud for processing).

I mean yeah, that makes good marketing copy, but it's more about reducing latency and keeping running costs down.

but as this is mostly marketing fluff we'll need to actually see how it performs before casting judgment on how "revolutionary" it is.



> to put inference on edge devices...

It will take a long time before you can put performant inference on edge devices.

Just download one of the various open-source large(st) language models and test it on your desktop...

Compute power, memory and storage requirements are insane if you want decent results... I mean not just Llama gibberish.

Until such requirements are satisfied, remote models are the way to go, at least for conversational models.

Aside from LLMs, AlphaGo would not run on any end-user device, by a long shot, even though it is already an 'old' technology.

I think a 'neural engine' on end-user devices is just marketing nonsense at the current state of the art.



>I personally really really welcome as I don't want the AI future to be centralised in the hands of a few powerful cloud providers.

Watch out for being able to use AI on your local machine, and those AI services using telemetry to send your data (recorded conversations, for instance) to their motherships.



I agree but for a different reason.

Now the subscription is $20 a month and the API price is accessible. What will happen when they all decide to 100x or 1000x the price of their API? All the companies that got rid of people in favor of AI might have lost the knowledge as well. This is dangerous and might kill a lot of companies, no?



> and those AI services using telemetry to send your data (recorded conversations, for instance) to their motherships

This doesn’t require AI, and I am not aware of any instances of this happening today, so what exactly are we watching out for?



On “privacy”: if Apple owned the Search app instead of paying Google, and used their own ad network (which they already have for the App Store today), Apple would absolutely use your data, location, etc. to target you with ads.

It could even be third-party services sending ad candidates directly to your phone and then the on-device AI choosing which is relevant.

Privacy is a contract, not the absence of a clear business opportunity. Just look at how Apple does testing internally today. They have no more respect for human privacy than any of their competitors. They just differentiate through marketing and design.



> This is completely coherent with their privacy-first strategy

How can they have a privacy first strategy when they operate an Ad network and have their Chinese data centers run by state controlled companies?



... I think that the more correct assertion would be that Apple is a sector leader in privacy, if only because their competitors make no bones about violating the privacy of their customers, as it is the basis of their business model. So it's not that Apple is an A+ student so much as the other students are getting Ds and Fs.


Prob because they are like super-behind in the cloud space; it's not like they wouldn't like to sell the service. They have ignored photo privacy quite a few times in iCloud.


> This is completely coherent with their privacy-first strategy

Apple has never been privacy-first in practice. They give you the illusion of privacy but in reality it's a closed-source system and you are forced to trust Apple with your data.

They also make it a LOT harder than Android to execute your own MITM proxies to inspect what exact data is being sent about you by all of your apps including the OS itself.



You say that like open source isn't also an illusion of trust.

The reality is, there's too much to verify, and not enough interest for the "many eyeballs make all bugs shallow" argument.

We are, all of us, forced to trust, forced to go without the genuine capacity to verify. It's not great, and the best we can do is look for incentives and try to keep those aligned.



Open source is like democracy. Imperfect and easy to fuck up, but still by far the best thing available.

Apple is absolutism. Even the so called "enlightened" absolutism is still bad compared to average democracy.



I don't agree with relying on the many eyeballs argument for security, but from a privacy standpoint, I do think at least the availability of source to MY eyeballs, as well as the ability to modify, recompile, and deploy it, is better than "trust me bro I'm your uncle Steve Jobs and I know more about you than you but I'm a good guy".

If you want to, for example, compile a GPS-free version of Android that appears like it has GPS but in reality just sends fake coordinates to keep apps happy thinking they got actual permissions, it's fairly straightforward to make this edit, and you own the hardware so it's within your rights to do this.

Open-source is only part of it; in terms of privacy, being able to see what all is being sent in/out of my device is arguably more important than open source. Closed source would be fine if they allowed me to easily inject my own root certificate for this purpose. If they aren't willing to do that, including a 1-click replacement of the certificates in various third-party, certificate-pinning apps that are themselves potential privacy risks, it's a fairly easy modification to any open source system.

A screen on my wall that flashes every JSON that gets sent out of hardware that I own should be my right.
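
For what it's worth, you can get a crude version of that "screen on the wall" today with mitmproxy: point the device's Wi-Fi proxy at the machine running it, install mitmproxy's CA certificate on the device, and run a small addon like this sketch (it won't help with apps that pin their own certificates):

  # flash_json.py - run with: mitmdump -s flash_json.py
  from mitmproxy import http

  def request(flow: http.HTTPFlow) -> None:
      # Print the body of every outgoing request that claims to be JSON.
      if "json" in flow.request.headers.get("content-type", ""):
          print(flow.request.pretty_url)
          print(flow.request.get_text())

It's the certificate-pinned, first-party traffic you can't see this way, which is exactly the part I'd most like to see.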



> Open-source is only part of it; in terms of privacy, being able to see what all is being sent in/out of my device is is arguably more important than open source.

I agree; unfortunately it feels as if this ship has not only sailed, but the metaphor would have to be expanded to involve the port as well.

Is it even possible, these days, to have a functioning experience with no surprise network requests? I've tried to limit mine via an extensive hosts file list, but that did break stuff even a decade ago, and the latest version of MacOS doesn't seem to fully respect the hosts file (weirdly it partially respects it?)

> A screen on my wall that flashes every JSON that gets sent out of hardware that I own should be my right.

I remember reading a tale about someone, I think it was a court case or an audit, who wanted every IP packet to be printed out on paper. Only backed down when the volume was given in articulated lorries per hour.

I sympathise, but you're reminding me of that.



> Apple has never been privacy-first in practice

> They also make it a LOT harder than Android to execute your own MITM proxies

I would think ease of MITM and privacy are opposing concerns



Yeah, given that they resisted putting RCS in iMessage so long, I am a bit skeptical about the whole privacy narrative. Especially when Apple's profit is at odds with user privacy.


From my understanding, the reason RCS was delayed is because Google's RCS was E2EE only in certain cases (both users using RCS). But also because Google's RCS runs through Google servers.

If Apple enabled RCS in messages back then, but the recipient was not using RCS, then Google now has the decrypted text message, even when RCS advertises itself as E2EE. With iMessage, at least I know all of my messages are E2EE when I see a blue bubble.

Even now, RCS is available on Android if using Google Messages. Yes, it's pre-installed on all phones, but OEMs aren't required to use it as the default. It opens up more privacy concerns because now I don't know if my messages are secure. At least with the green bubbles, I can assume that anything I send is not encrypted. With RCS, I can't be certain unless I verify the messaging app the recipient is using and hope they don't replace it with something else that doesn't support RCS.



You know what would really help Apple customers increase their privacy when communicating with non-Apple devices?

Having iMessage available to everyone regardless of their mobile OS.



RCS is a net loss for privacy: it gives the carriers visibility into your social graph and doesn’t support end to end encryption. Google’s PR campaign tried to give the impression that RCS supports E2EE but it’s restricted to their proprietary client.


> rooted devices are denied access to it

By what? It's impossible for a process to know for sure if the system is rooted or not. A rooted system can present itself to a process to look like a non-rooted system if it's engineered well enough.

I'd bet that most of these apps probably just check if "su" returns a shell, in which case perhaps all that's needed is to modify the "su" executable to require "su --magic-phrase foobar" before it drops into a root shell, and returns "bash: su: not found" or whatever if called with no arguments.
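
A sketch of that gatekeeping wrapper in Python, just to make the idea concrete (the path of the renamed real binary is made up, and a real rooted Android setup would do this inside the su binary itself):

  #!/usr/bin/env python3
  import os
  import sys

  MAGIC = ("--magic-phrase", "foobar")

  if len(sys.argv) >= 3 and tuple(sys.argv[1:3]) == MAGIC:
      # Caller knows the magic phrase: hand off to the real su binary,
      # dropping the magic arguments so it behaves normally from here on.
      os.execv("/system/xbin/su.real", ["su"] + sys.argv[3:])
  else:
      # Anyone else (e.g. a naive root-detection check) sees what a
      # non-rooted shell would report.
      sys.stderr.write("sh: su: not found\n")
      sys.exit(127)

Of course, this only fools that kind of naive check; it does nothing against hardware-backed attestation.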



>A rooted system can present itself to a process to look like a non-rooted system if it's engineered well enough.

That was true 20 years ago, but most smartphones these days have cryptographically-verified boot chains and remote attestation of how the boot went.



> This is completely coherent with their privacy-first strategy (which would be at odds with sending data up to the cloud for processing).

I feel like people are being a bit naïve here. Apple's "Privacy First" strategy was a marketing spin developed in response to being dead-last in web-development/cloud computing/smart features.

Apple has had no problem changing their standards by 180 degrees and being blatantly anti-consumer whenever they have a competitive advantage to do so.



Having worked at Apple, I can assure you it's not just spin. It's nigh on impossible to get permission to even compare your data with another service inside of Apple, and even if you do get permission, the user IDs and everything are completely different so there's no way to match up users. Honestly it's kind of ridiculous the lengths they go to, and it makes development an absolute PITA.


That could very well be true, but I also think it could change faster than people realize. Or that Apple has the ability to compartmentalize (kind of like how Apple can advocate for USB C adoption in some areas and fight it in others).

I'm not saying this to trash Apple - I think it's true of any corporation. If Apple starts losing revenue in 5 years because their LLM isn't good enough because they don't have enough data, they are still going to take it and have some reason justifying why theirs is privacy focused and everyone else is not.



Of course! The difference is that, for the time being, my incentives are aligned with theirs in regards to preserving my privacy.

The future is always fungible. Anyone can break whatever trust they've built very quickly. But, like the post you are replying to, I have no qualms about supporting companies that are currently doing things in my interest and don't have any clear strategic incentive to violate that trust.

Edit: that same incentive structure would apply to NVIDIA, afaik



I can't agree with your comment. Apple has all the incentives to monetize your data; that's the whole value of Google and Meta. And they are already heading into the ad business, earning billions last I checked. Hardware isn't selling as much as before, and this isn't going to change for the better in the foreseeable future.

The logic is exactly the same as what, e.g., Meta claims: we will pseudoanonymize your data, so technically your specific privacy is just yours, see, nothing changed. But you are in various target groups for ads, plus we know how 'good' those anonymization efforts are when money is at play and corporations are only there to earn as much money as possible. The rest is PR.



I'll disagree with your disagreement - in part at least. Apple is still bigger than Meta or Google. Even if they had a strong channel to serve ads or otherwise monetize data, the return would represent pennies on the dollar.

And Apple's privacy stance is a moat against these other companies making money off of their customer base. So for the cost of pennies on the dollar, they protect their customer base and ward off competition. That's a pretty strong incentive.



Don't bother; the fanboys have decided Apple can't do anything wrong/malicious. At this point it's closer to a religion than ever.

You would be amazed at the response of some of them when I point out some shit Apple does that makes their products clearly lacking for the price; the cognitive dissonance is so strong they don't know how to react in any other way than lying or pretending it doesn't matter.



If you’re annoyed about quasi-religious behavior, consider that your comment has nothing quantifiable and contributed nothing to this thread other than letting us know that you don’t like Apple products for non-specific reasons. Maybe you could try to model the better behavior you want to see?


> This is completely coherent with their privacy-first strategy

Is this the same apple whose devices do not work at all unless you register an apple account?



Some people really seem to be truly delusional. It's obvious that the company's "privacy" is a marketing gimmick when you consider the facts. Do people not consider the facts anymore? How does somebody appeal to the company's "privacy-first strategy" with a straight face in light of the facts? I suppose they are not aware of the advertising ID that is embedded in all Apple operating systems. That one doesn't even require login.


A "mistake" seems to be putting it lightly when the thing has been reiterated multiple times throughout the years, but yeah. Seems more like blind dogma. Obviously people don't like the facts pointed out to them either as you can tell by the down votes on my comment. If I am wrong, please tell me how in a reply.


their privacy strategy is to make you feel comfortable with their tech so you don't mind when they shop it around to the highest bidder.

Make no mistake, they're just waiting for the right MBA to walk through the door, see the sky high value of their users and start chop shopping that.

Enshittification is always available to the next CEO, and this is just going to be more and more tempting as the value of the walled garden increases.



Yes, it’s possible that they’ll change in the future but that doesn’t make it inevitable. Everything you describe could have happened at any point in the last decade or two but didn’t, which suggests that it’s not “waiting for the right MBA” but an active effort to keep the abusive ones out.

One thing to remember is that they understand the value of long-term investments. They aren’t going to beat Google and Facebook at advertising and have invested billions in a different model those companies can’t easily adopt, and I’m sure someone has done the math on how expensive it would be to switch.



- dozens of horrific 0-day CVEs every year because not enough is invested in security, making privacy virtually impossible

- credit card required to install free apps such as the "private" Signal messenger

- location required just to show me the weather in a static place, lol

- claims to be E2E but Apple controls all keys and identities

- basically sells out all users' iCloud data in China, and totally doesn't do the same in the US, because Tim pinky swears

- everything is closed source



> complete independence of network connectivity and hence minimal latency.

Does it matter that each token takes additional milliseconds on the network if the local inference isn't fast? I don't think it does.

The privacy argument makes some sense, if there's no telemetry leaking data.



> This is completely coherent with their privacy-first strategy (...)

I think you're trying too hard to rationalize this move as pro-privacy and pro-consumer.

Apple is charging a premium for hardware based on performance claims, which they need to create relevance and demand for it.

There is essentially zero demand for the capacity to run computationally demanding workloads beyond very niche applications, at least by the standard of what has counted as demanding on the consumer-grade hardware sold over the past two decades.

If Apple offloads these workloads to the customer's own hardware, they don't have to provide this computing capacity themselves. This means no global network of data centers, no infrastructure, no staff, no customer support, no lawyer, nothing.

More importantly, Apple claims to be pro privacy but their business moves are in reality in the direction of ensuring that they are in sole control of their users' data. Call it what you want but leveraging their position to ensure they hold a monopoly over a market created over their userbase is not a pro privacy move, just like Apple's abuse of their control over the app store is not a security move.
