To any Linux users, I recently bought a fully loaded M4 MacBook Pro to replace my aging Lenovo and strongly regret it. I thought I would use it for playing with LLMs, but local dev on a Mac is not fun and I still don't have it fully set up. I'll probably replace it with a Framework at some point in the near future.
Edit: okay, that garnered more attention than I expected, I guess I owe a qualification.
1. Everything is just slightly different. I had to split all my dot files into common/Linux/Mac specific sections. Don't expect to be able to clone and build any random C++ project unless someone in the project is specifically targeting Mac.
2. Not everything is supported natively on arm64. I had an idea and wanted to spin up a project using DynamoRIO, but it wasn't supported. Others have mentioned the Docker quirks.
3. The window manager. I'm not a fan of all the animations and needing to gesture between screens (and yes, I've been down the hotkeys rabbit hole). To install a 3rd party window manager you need to disable some security setting because apparently they work by injecting into the display manager and calling private APIs.
So my personal takeaway was that I took the openness of the Linux ecosystem for granted (I've always had a local checkout of the kernel so I can grep an error message if needed). Losing that, for me, felt like wearing a straitjacket. Ironically I have an MBP at work, but spend my day ssh'd into a Linux box. It's a great machine for running a web browser and a terminal emulator.
lholden 4 hours ago [-]
I ended up doing something similar a few years ago. Picked up a MacBook Pro M1 Max back when the M1 stuff was new to replace an aging Lenovo running Linux. I actually really loved my Lenovo + Linux, but the M1 was new and shiny and I desperately wanted better battery life.
The hardware was great, but life on a Mac always felt a bit convoluted. Updating the OS was especially frustrating as a software developer because of all the interdependent bits (Xcode, brew, etc.) that often ended up breaking my dev environment in some way. It also always amazed me what was missing. Like, how is the default terminal app not fully functional after all these years? On the plus side, over the time I used it they did add tiling and the ability to hide the notch.
Finally at the start of the year I moved back to Linux and couldn't be happier. Had forgotten just how nice it is to have everything I need out of the box. The big thing I miss is Affinity Photo, though that looks like it's in the middle of dying right now.
skydhash 2 hours ago [-]
I have an M1 Air and an 8th-gen Intel Dell (OpenBSD), and I'm much happier with the Dell for hacking on stuff. macOS is pretty much a nightmare if your workflow is not apps- and IDE-centered.
abrookewood 2 hours ago [-]
Pretty sure you can run Asahi on that? Might have been worth the effort instead of swapping out the machine as it's still pretty capable.
coldtea 9 hours ago [-]
>I thought I would use it for playing with LLMs, but local dev on a Mac is not fun and I still don't have it fully set up.
Sounds more like a you problem, probably due to unfamiliarity. There are endless options for local dev on a Mac, and a huge share of devs using one.
heavyset_go 5 hours ago [-]
I've used Macs for 20 years starting on the day 32-bit Intel Macs were released, and agree with the GP. Linux and Plasma spoiled me, going back to macOS and its windowing system feels like a step backward, especially for development, where using multiple windows is a must. Task switching is.. not good? I don't get window previews I can switch through when I hover over the dock, but I do on Linux.
Yes, I know about Yabai and the other things that modify the existing window manager. The problem is the window manager itself.
Outside of the windowing system, running native Linux if you're deploying to Linux beats using an amalgamation of old BSD utils + stuff from Homebrew and hoping it works between platforms, or using VMs. The dev tools that are native to Linux are also nice.
When it comes to multiple monitors, I want a dock on each monitor. I can do that in Plasma, but I can't in macOS, unless I use some weird 3rd party software apparently.
0x457 3 hours ago [-]
When you use linux as desktop, sometimes you get into a customization-hole and make everything "just right" because on linux everything is customizable.
Then you switch to macOS or Windows or even (not your) linux setup and hate it. When I manage to contain myself entirely to the terminal it's okay, but the moment I have to interact with GUI I start to miss those "just right" things.
I can relate. macOS hilariously sucks in certain GUI and terminal aspects. Not much you can do about the GUI; you just have to adapt to the way macOS wants to be used. For the terminal, I use home-manager to manage my $HOME. It's not space efficient and the public caches are sub-par, but it's better than searching "sed in-place replace macos and linux cross-platform" for the 9000th time.
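For the curious, the usual portable answer to that exact search is to give `-i` an explicit backup suffix: GNU sed treats the suffix as optional and attached to the flag, while BSD/macOS sed requires a suffix argument, so passing one explicitly works on both. A sketch:

```shell
# Create a throwaway file to edit in place.
printf 'foo one foo\n' > demo.txt

# GNU sed accepts -i with an attached optional suffix; BSD/macOS sed
# requires a suffix argument. An explicit suffix satisfies both.
sed -i.bak 's/foo/bar/g' demo.txt
rm -f demo.txt.bak   # discard the backup

cat demo.txt   # → bar one bar
```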
sotix 3 hours ago [-]
I'll echo the sentiment about being very familiar with macOS but being spoiled by Linux and KDE Plasma. I put up with my work MacBook. My personal Linux setup just works and gets out of the way as a machine.
Sharlin 4 hours ago [-]
> Task switching is.. not good? I don't get window previews I can switch through when I hover over the dock, but I do on Linux.
That just sounds like being accustomed to one way of switching tasks, honestly. If I want previews, I use Exposé (three-finger swipe up/down or Ctrl-up/down). But mostly I just use Cmd-Tab and haven't really needed to see previews there. Because macOS switches between applications, not windows, often there isn't a single window to preview, and I'm not sure showing all the windows would work well either. Exposé works well because it can use the entire screen to show previews.
afandian 2 hours ago [-]
The app switching behaviour is really infuriating. Selecting a window and having all of the app's windows come to the fore, obscuring windows from other apps, is still annoying, 20 years on.
And when you full-screen a window and switch to another app for a moment, you can't find it again without delving into the 'Window' menu.
Agreed. As a software engineer of ~8 years now, the Mac is actually my _preferred_ environment; I find it an extremely productive OS for development, whether I'm working full stack or on Unity game dev in my free time.
deaddodo 9 hours ago [-]
I don't agree with OP's sentiment that macOS is a bad dev environment, but surely I prefer Linux+KDE as an overall dev environment. I find that all the tools I need are there but that I'm fighting the UI/UX enough to make it a hassle relative to KDE.
robenkleene 8 hours ago [-]
> I don't agree with OP's sentiment that macOS is a bad dev environment, but surely I prefer Linux+KDE as an overall dev environment. I find that all the tools I need are there but that I'm fighting the UI/UX enough to make it a hassle relative to KDE.
This sounds like you think macOS is a good dev environment, but that you personally don't like the UI/UX (always safer to make UI/UX judgements subjective ["I don't like"] rather than objective ["it's bad"], since it's so difficult to evaluate objectively, e.g., compared to saying something like Docker doesn't run natively on macOS, which is just an objective fact).
deaddodo 4 hours ago [-]
I literally started off my comment saying that it's not bad. That means it's neutral-good, by definition.
I can easily develop on both, I prefer developing on Linux. Thus, it is "more good" (IMO), if you prefer.
robenkleene 4 hours ago [-]
Sorry, you're 100% right, I misread your comment.
SomeUserName432 9 hours ago [-]
I've been on a mac for ~4 years now.
It was a bit of a struggle to get used to it, coming from Windows.
The only thing I really miss now is alt-tab working as expected. (It's a massive pain to move between two windows of the same program)
WXLCKNO 9 hours ago [-]
You know you can use CMD+backtick (CMD+`) to cycle between windows of the same app? Add shift to go in reverse.
Or otherwise you can enable the app exposé feature to swipe down with three fingers and it will show you only windows of the same app.
metabagel 5 hours ago [-]
This usually doesn't work for me.
For example, if I open a new Firefox window, the Mac seems to force the two Firefox windows onto different desktops. This already is a struggle, because sometimes I don't want the windows to be on two desktops. I find that if I try to move one window to the same desktop as the other, then Mac will move the other desktop to the original desktop so they are both still on different desktops.
OK, got sidetracked there on a different annoyance, but on top of the above, CMD-backtick doesn't usually work for me, and I attribute it to the windows typically being forced onto different desktops. Some of the constraints for using a Mac are truly a mystery to me, although I'm determined to master it eventually. It shouldn't be this difficult though. For sure, Mac is nowhere near as intuitive as it's made out to be.
godelski 4 hours ago [-]
> two Firefox windows onto different desktops
My favorite is how it'll force move your workspace if you get a popup.
To reproduce: get a second monitor, throw your web browser onto that second monitor (not in full screen), and then open an application in full screen on your laptop's screen (I frequently have a terminal there). Then go to a site that gives you a popup for OAuth or a security key (e.g. GitHub, Amazon, Claude; you've got a million options here). Watch as you get a jarring motion on the screen you aren't looking at, have to finish your login, and then move back to where you were.
> Mac are truly a mystery to me
Everyone tells me how pretty and intuitive they are, yet despite being on one for years I have not become used to it. It is amazing how many dumb and simple little problems arise out of normal behavior like connecting a monitor. Like, what brilliant engineer decided it was a good idea to not allow certain resolutions despite the monitor... being that resolution? Or all the flipping back and forth. It's like they looked at KDE workspaces and thought "let's do that, but make it jarring and not actually have programs stay in their windows". I thought Apple cared about design and aesthetics, but even as a Linux user I find these quite ugly and unintuitive.
arccy 3 hours ago [-]
I'm truly annoyed at it reordering the desktops even when i have just a single screen (the built in one). I expect my programs to be in certain order, so switching between them is predictable.
Or sometimes it just decides to open a link in a new Chrome window instead of just opening a tab... and not even consistently.
dapperdrake 5 hours ago [-]
Full-screen windows (little green button at the top) seem to get "their own desktop". Has tripped me up a few times.
godelski 4 hours ago [-]
Or just put a program onto a second monitor then open a second window for that program. Usually it will not open in the same monitor. This is especially fun when you get pop-ups in browsers...
ilikepi 8 hours ago [-]
> Or otherwise you can enable the app exposé feature to swipe down with three fingers and it will show you only windows of the same app.
If you have an Apple keyboard, CTRL-F3 (without the Fn modifier) will do the same. Not sure if there are third-party keyboards that support Mac media keys, but I'm guessing there are some at least...
roryirvine 8 hours ago [-]
That has terrible ergonomics for anyone using a non-US keyboard, though - the backtick is immediately above the option key so to hit together with CMD requires clawing together your thumb and little finger.
GNOME does this much better, as it instead uses Super+<whatever the key above Tab is>. In the US, that remains ` but elsewhere it's so much better than on MacOS.
KORraN 6 hours ago [-]
> That has terrible ergonomics for anyone using a non-US keyboard, though - the backtick is immediately above the option key so to hit together with CMD requires clawing together your thumb and little finger.
That's true, hence why I remap it to a "proper" key, above Tab with:
Hey guys, Linux is rubbish because you have to default to the command line just to get it to work properly...
eitland 7 hours ago [-]
Only, sometimes it doesn't work. (For me on a Norwegian keyboard it is Cmd+<.)
Specifically, sometimes it works with my Safari windows and sometimes it doesn't.
And sometimes when it doesn't work, Option+< will work for some reason.
But sometimes that doesn't work either and then I just have to swipe and slide or use alt-tab (yes, you can now install a program that gives you proper alt-tab, so I do not have to deal with this IMO nonsense, it just feels like the right thing to do when I know I'm just looking for the other Safari window.)
I'm not complaining, I knew what I went to when I asked $WORK for a Mac, I have had one before and for me the tradeoff of having a laptop supported by IT and with good battery time is worth it even if the UX is (again IMO) somewhat crazy for a guy who comes from a C64->Win 3.1->Windows 95/98->Linux (all of them and a number of weird desktops) background.
Apple really doesn't tell power-users about a lot of these features. You can really gain a lot by searching for Mac shortcuts and tricks. I still learn new things that have been around for over a decade.
robenkleene 8 hours ago [-]
I'd argue if you need to be told about keyboard shortcuts, then you're not a power user. (I.e., knowing how to find keyboard shortcuts I'd consider a core trait of power users).
metabagel 5 hours ago [-]
Keyboard shortcuts should be exposed in some fashion. IMO, Microsoft is typically better at this.
robenkleene 5 hours ago [-]
What specifically does Microsoft do that Apple should do?
xpe 52 minutes ago [-]
Another tip: lots of useful characters are only an option-press away. You can find them by viewing your keyboard [1], which is easy if you have your input source on your dock. Some of my favorites:
⌥k = ˚ (degree) ⌥e a = á
⌥p = π (pi) ⌥e e = é
⌥5 = ∞ (infinity) ⌥e i = í
⌥d = ∂ (delta) ⌥e o = ó
⌥8 = • (bullet) ⌥e u = ú
⇧⌥9 = · (middot) ⌥n n = ñ
Well I have some good news, look up the AltTab MacOS tool. It does exactly what you're after and it has been a life saver for me.
debunn 8 hours ago [-]
As a hybrid macOS / Windows user (with 20+ years of Windows keyboard muscle memory), I found Karabiner Elements a godsend. You can import complex key modifications from community built scripts which will automatically remap things like Cmd+Tab to switch windows, as well as a number of other Windows hotkeys to MacOS equivalents (link below):
I don't think I could survive on MacOS without AltTab.
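For a sense of what those community scripts contain, a single Karabiner complex modification is JSON along these lines. The mapping below (Cmd+Tab to Cmd+backtick, i.e. making Tab cycle windows of the frontmost app) is just an illustrative sketch, not one of the imported scripts:

```json
{
  "description": "Cmd+Tab cycles windows of the frontmost app (sketch)",
  "manipulators": [
    {
      "type": "basic",
      "from": {
        "key_code": "tab",
        "modifiers": { "mandatory": ["left_command"] }
      },
      "to": [
        {
          "key_code": "grave_accent_and_tilde",
          "modifiers": ["left_command"]
        }
      ]
    }
  ]
}
```

Rules like this go under "Complex Modifications" in the Karabiner-Elements UI, or in a file under `~/.config/karabiner/assets/complex_modifications/`.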
afandian 2 hours ago [-]
I’ve not used windows since XP, but the one thing I missed was the keyboard menu navigation with alt and the underlined single letters. Still in my muscle memory, and I always felt like it was at least as good as keyboard shortcuts.
91bananas 9 hours ago [-]
Play with your keyboard, alt, ctrl, cmd + tab or ~ or combos of those will do wild things for ya
whycome 9 hours ago [-]
Apparently i'm the only one who didn't know about cmd+`
xpe 30 minutes ago [-]
Poking around System Settings > Keyboard > Keyboard Shortcuts… > Keyboard is a pretty good place to browse if you haven't. There are default keybindings that can be changed and switched on/off. For macOS Sequoia 15.6:
[ ] Change the way Tab moves focus ⌃F7
[ ] Turn keyboard access on or off ⌃F1
[ ] Move focus on the menu bar ⌃F2
[ ] Move focus on the Dock ⌃F3
[ ] Move focus to the active or next window ⌃F4
[ ] Move focus to the window toolbar ⌃F5
[ ] Move focus to the floating window ⌃F6
[*] Move focus to next window ⌘`
[ ] Move focus to status menus ⌃F8
[ ] Show contextual menu ⌃↩
I only have one checked currently; I'm not feeling adventurous.
sator-arepo 9 hours ago [-]
You know about Cmd+Backtick, don't you?
wiseowise 9 hours ago [-]
CMD + ~.
neya 3 hours ago [-]
Wow, this account's recent comment history is just full-on blasting with pro-apple opinions and attacking anyone who posts even a tinge of negativity about Apple or its recent product(s). I find it amusing we'd become so defensive about for-profit companies and their products..
godelski 4 hours ago [-]
> Sounds more like a you problem
I'm sorry, I just really hate this Apple Fanboy rhetoric. It's frequent and infuriating. Don't get me wrong, I hate it when the linux people do it too, but they tend to tell you how to get shit done while being mean.
The biggest problem with Linux is poor interfaces[0], but the biggest problem with Apple is handcuffs. And honestly, I do not find Apple interfaces intuitive. Linux interfaces and structure I get; even if the barrier to entry is a bit higher, there's lots of documentation. Apple, less so. But also with Apple there are just things that are needlessly complex, buried in multiple different locations, and inconsistent.
But I said the biggest problem is handcuffs. So let me give a very dumb example. How do you merge identical contacts? Here's the official answer[1]
Either:
1) Card > Look for Duplicates
2) Select the duplicate cards, then Card > Merge Selected Cards.
Well guess what? #2 isn't an option! I believe this option only appears if you have two contacts that are in the same address book. Otherwise you get the option "Link Selected Cards". That isn't clear, since the card doesn't tell you which account it comes from, and clicking "Look for Duplicates" won't offer this suggestion to you. There are dozens of issues like this where you can be right that I'm "holding it wrong", but that just means the interface isn't intuitive. You can try this one out: go to your contacts, select "All Contacts", then click any random one and try to figure out which address book that contact is from. It will not tell you unless you have linked contacts. And that's the idiocy of Apple. Everything works smoothly[2] when you've always been on Apple and only use Apple, but it's painful to even figure out what the problem is when you have one. The docs are horrendous. The options in the menu bar change and inconsistently disappear or gray out, leading to "where the fuck is that button?".
So yeah, maybe a lot of this is due to unfamiliarity, but it's not like they are making it easy. With Apple, it is "Do things the Apple way, or not at all". But with Linux it is "sure whatever you say ¯\_(ツ)_/¯". If my Android phone is not displaying/silencing calls people go "weird, have you tried adjusting X settings?" But if my iPhone is not displaying/silencing calls an Apple person goes "well my watch tells me when someone is calling" and they do not understand how infuriating such an answer is. Yet, it is the norm.
I really do want to love Apple. They make beautiful machines. But it is really hard to love something that is constantly punching you in the face. Linux will laugh when you fall on your face, but it doesn't actively try to take a swing or put up roadblocks. There's a big difference.
[0] But there's been a big push the last few years to fix this and things have come a long way. It definitely helps that Microsoft and Apple are deteriorating, so thanks for lowering the bar :)
> With Apple, it is "Do things the Apple way, or not at all".
Well, kinda. You don't have to use all that much Apple software on Macs, though. If you can live with the window manager / desktop environment, then you can use whichever apps you choose for pretty much anything else.
klardotsh 2 hours ago [-]
Which would be less of a problem if the window manager and desktop environment weren’t some of the absolute worst parts of the entire OS.
godelski 3 hours ago [-]
I'm not sure this is true, especially if you're a "power user"[0]. Here's an example: I want to modify `~/.ssh/config` to define a machine's alias depending on the SSID I'm on. So I want this logic:
If on MyHomeSSID:
    Host FooComputer
    Hostname 192.168.1.123
Else if tailscale-is-running:
    Host FooComputer
    Hostname 100.64.0.123
The reason you might want to do this is so that you can have your ssh connection adapt to the network you're using. You can just always write `ssh FooComputer` and get the connection you want. This can get much more complicated[1], but is incredibly useful.
How would you accomplish this? Well actually, I don't know ANYMORE[2]. The linked thread had a solution that worked, but `ipconfig getsummary en0` now redacts the SSID (even when running sudo!). Though `system_profiler SPAirPortDataType` still works and I can get the result in 4 seconds... So not actually a solution. Yet it shows the idiocy and inconsistency of Apple's tooling. There was a solution, then Apple changed it. wtallis helped me find a different solution, and well... then Apple changed it. YET `system_profiler` still doesn't redact the SSID, so what is going on? Why is it even redacted in the first place? I can just throw my cursor up to the top right of the screen and see the SSID information. If it was a security issue then I should not be able to view that information in GUI OR CLI, and it would be a big concern if I could see it in some unprivileged programs but not in others.
And that's the problem with Apple. If I write some script to do some job, I don't know if that script is going to work in 6mo because some person decided they didn't want that feature. So I can find some other command to do the exact same thing and end up playing a game of Whack-a-mole. *It is absolutely infuriating.* This is what I mean by "constantly punching you in the face". The machine fights you and that's not okay.
[0] I put it in quotes because the example I'm about to give is to some "complex" but to others "dead simple". I'd actually say the latter is true.
[side note] I've used a similar SSID trick to write myself a "phone home" program in termux for Android and other machines. I can get my GPS coordinates and other information there so you can just write a <50 line program to ping a trusted machine if your device doesn't check in to trusted locations within certain timeframes. Sure, there's FindMy, but does that give me a history? I can't set an easing function to track if my device is on the move. Can I remote into the lost machine? Can I get it to take pictures or audio to help me locate it? Can I force on tailscale or some other means for me to get in without the other person also having technical knowledge? Why not just have a backup method in case one fails? I'm just trying to give this as an example of something that has clear utility and is normally simple to write.
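For what it's worth, the If/Else logic above can be approximated natively in ssh_config with `Match exec`, assuming you have some SSID helper that still works on your macOS version (which is exactly the moving target being complained about). A sketch, where `current-ssid` is a hypothetical script on PATH that prints the Wi-Fi network name:

```
# ~/.ssh/config (sketch): first matching Hostname wins, so the
# home-network rule takes priority over the tailnet fallback.
Match host FooComputer exec "current-ssid | grep -qx MyHomeSSID"
    Hostname 192.168.1.123

# Fallback: if the tailscale daemon is running, use the tailnet address.
Match host FooComputer exec "pgrep -x tailscaled"
    Hostname 100.64.0.123
```

Since ssh evaluates `Match exec` every time you connect, `ssh FooComputer` adapts to whatever network you're on, which is the point; the fragile part remains the SSID helper itself.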
thewebguyd 8 hours ago [-]
> The window manager. I'm not a fan of all the animations and needing to gesture between screens (and yes, I've been down the hotkeys rabbit hole). To install a 3rd party window manager, you need to disable some security setting because apparently they work by injecting into the display manager and calling private APIs.
For the vanilla macOS workspaces, though: if you avoid full-screen apps (since those go to their own ephemeral workspace that you can't keybind to, for some stupid reason) and create a fixed number of workspaces, you can bind keyboard shortcuts to switch to them. I have 5 set up, and use Ctrl+1/2/3/4/5 to switch between them instead of using gestures.
Apart from that, I use Raycast to set keybindings for opening specific applications. You can also bind Apple Shortcuts that you make.
Still not my favorite OS over Linux, but I've managed to make it work because I love the hardware, and outside of $dayjob I do professional photography, and the Adobe suite runs better here than on even my insanely overspecced gaming machine on Windows.
nextos 7 hours ago [-]
Mac laptop hardware is objectively better, but I am in the same camp as the parent post. For most development workflows, Linux is my favorite option. In particular, I think NixOS and the convenience of x86_64 are usually worth the energy efficiency deficit relative to Apple M.
It will be interesting to see how this evolves as local LLMs become mainstream and support for local hardware matures. Perhaps, the energy efficiency of the Apple Neural Engine will widen the moat, or perhaps NPUs like those in Ryzen chips will close the gap.
sanswork 6 hours ago [-]
I develop using a MacBook because I like the hardware and non-development apps, but all my actual work happens on a Linux server I connect to. It's a good mix.
foxandmouse 20 minutes ago [-]
Thanks for sharing AeroSpace, can't believe I overlooked it! It's like finding out someone fixed half the things that make macOS feel like a beautiful prison. Somehow it makes the whole OS feel less… Apple-managed.
devilsdata 3 hours ago [-]
I have a pretty good cross-platform dotfiles setup for both Mac OS and Linux that I use Chezmoi to provision. I try not to repeat myself as much as possible.
I use Linux at work and for gaming, and Mac OS for personal stuff. They both build from the same dotfiles repository.
Some things I've learned:
- Manually set Mac's XDG paths to be equal to your Linux ones. It's much less hassle than using the default system ones.
- See my .profile as an example on how I do this: https://github.com/lkdm/dotfiles/blob/main/dot_profile.tmpl
- Use Homebrew on both Linux and Mac OS for your CLI tools
- Add Mac OS specific $PATH locations /bin, /usr/sbin, /sbin
- Do NOT use Docker Desktop. It's terrible. Use the CLI version, or use the OrbStack GUI application if you must.
- If you use iCloud, make a Zsh alias for the iCloud Drive base directory
- Mac OS ships with outdated bash and git. If you use bash scripts with `#!/usr/bin/env bash`, you should install a newer version of bash with brew, and make sure Homebrew's opt path comes before the system one, so the new bash is prioritised.
I hope this is helpful to you, so feel free to ask me anything about how I set up my dotfiles.
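As a concrete sketch of the XDG and Homebrew-ordering points above (the paths are the standard defaults; adjust to taste):

```shell
# Pin the XDG base directories explicitly so macOS and Linux agree,
# instead of letting tools fall back to macOS's ~/Library/... defaults.
export XDG_CONFIG_HOME="$HOME/.config"
export XDG_DATA_HOME="$HOME/.local/share"
export XDG_STATE_HOME="$HOME/.local/state"
export XDG_CACHE_HOME="$HOME/.cache"

# Put Homebrew's bin dirs ahead of the system ones so brew-installed
# bash/git shadow the outdated macOS versions. Apple Silicon installs
# under /opt/homebrew, Intel Macs under /usr/local, Linuxbrew under
# its own prefix; whichever exists first wins.
for prefix in /opt/homebrew /usr/local /home/linuxbrew/.linuxbrew; do
  if [ -x "$prefix/bin/brew" ]; then
    eval "$("$prefix/bin/brew" shellenv)"
    break
  fi
done
```

On a machine without Homebrew the loop simply does nothing, so the same file works everywhere.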
radley 8 hours ago [-]
Is there a HN bingo card? Because we always get a top comment for Linux user who tries Mac and decides they prefer Linux.
disgruntledphd2 7 hours ago [-]
I kinda agree with the OP, but then I was a Linux user for well over a decade. I do think that C/C++ libraries are much, much more of a pain on Mac as soon as you go off the beaten path (compiling GDAL was not pleasant, whereas it would be a breeze on Linux).
Some of this is probably brew not being as useful as apt, and some more of it is probably me not being as familiar with the Mac stuff, but it's definitely something I noticed when I switched.
The overall (graphical) UI is much more fluid and convenient than Linux, though.
tannhaeuser 6 hours ago [-]
I have to agree. The loss of sense of reality among Linux fanboys is really annoying.
I had been a Linux notebook user for many years and have praised it on this board years ago. But today the Linux desktop has regressed into a piece of trash even for basic command line usage while providing zero exclusive apps worth using. It's really sad since it's unforced and brought upon Linux users by overzealous developers alone.
Mac OS trounces Linux in absolutely every way on the desktop, it's not even funny: performance, battery life, apps, usability, innovation. Available PC notebook hardware is a laughable value compared to even an entry-level Apple MacBook Air. Anecdata, but I've had no fewer than five "pro" notebooks (Dell Latitude, XPS, and Lenovo ThinkPad) come and go in the last five years, with basic battery problems, mechanical touchpad problems, touchpad driver issues, WLAN driver issues, power management issues, gross design issues, and all kinds of crap, so I'm pretty sure I know what I'm talking about.
The one thing Mac isn't great for is games, and I think SteamOS/Proton/wine comes along nicely and timely as Windows is finally turning to the dark side entirely.
metabagel 5 hours ago [-]
> Mac OS trounces Linux in absolutely every way on the desktop it's not even funny: performance, battery life, apps, usability, innovation.
performance - I don't agree
battery life - absolutely
apps - absolutely
usability - I don't agree
innovation - I don't agree
One significant annoyance with Linux on a laptop is that configuring suspend-then-hibernate is an arduous task, whereas it just works on a MacBook.
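For anyone fighting that setup on a systemd distro, the configuration side is roughly the following (a sketch; the option names come from systemd's sleep.conf and logind.conf, and the delay is arbitrary):

```
# /etc/systemd/sleep.conf
[Sleep]
AllowSuspendThenHibernate=yes
HibernateDelaySec=60min

# /etc/systemd/logind.conf
[Login]
HandleLidSwitch=suspend-then-hibernate
```

The arduous part is usually elsewhere: you also need swap space large enough to hold RAM and a working `resume=` kernel parameter, which is where most guides (and laptops) diverge.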
But, the main thing is commercial application support.
pas 4 hours ago [-]
hardware and basic OS reliability is really good when it comes to MBPs, unfortunately the rest of the software stack is maddening
delbronski 8 hours ago [-]
As a long-term Mac user who works with ROS a lot, I hear you. Most people here think local dev means developing a React app. Outside of mainstream web frameworks, the Mac sucks for local dev.
lavp 7 hours ago [-]
I’ve found Macs to be good for most dev stuff with the exception of non-Dockerized C++.
Unfortunately I do a lot of C++… I hate the hoops you have to jump through to not use the Apple Clang compiler.
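For reference, the usual escape hatch is Homebrew's LLVM; a sketch of the incantation (paths assume Apple Silicon, where Homebrew lives under /opt/homebrew; Intel Macs use /usr/local):

```shell
brew install llvm

# Point builds at Homebrew's clang instead of Apple Clang.
export CC=/opt/homebrew/opt/llvm/bin/clang
export CXX=/opt/homebrew/opt/llvm/bin/clang++
export LDFLAGS="-L/opt/homebrew/opt/llvm/lib"
export CPPFLAGS="-I/opt/homebrew/opt/llvm/include"

# For a CMake project:
cmake -B build -DCMAKE_C_COMPILER="$CC" -DCMAKE_CXX_COMPILER="$CXX"
```

Homebrew deliberately keeps LLVM out of the default PATH ("keg-only"), which is part of the hoop-jumping being complained about.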
saltcured 7 hours ago [-]
I can relate. I've spent almost 30 years working primarily on Linux. I moved Windows into a VM when I needed it around for occasionally using MS Office, first under VMware and later under KVM. Now I don't even use it as a VM, since work has Office 365.
My work got me a similar M4 MacBook Pro early this year, and I find the friction high enough that I rarely use it. It is, at best, an annoying SSH over VPN client that runs the endpoint-management tools my IT group wants. Otherwise, it is a paperweight since it adds nothing for me.
The rest of the time, I continue to use Fedora on my last gen Thinkpad P14s (AMD Ryzen 7 PRO 7840U). Or even my 5+ year old Thinkpad T495 (AMD Ryzen 7 PRO 3700U), though I can only use it for scratch stuff since it has a sporadic "fan error" that will prevent boot when it happens.
But I'm not doing any local work that is really GPU dependent. If I were, I'd be torn between chasing the latest AMD iGPU that can use large (but lower bandwidth) system RAM and rekindling my old workstation habit of hosting a full-size graphics card. It would depend on the details of what I needed to run. I don't really like the NVIDIA driver experience on Linux, but I've worked with it in the past (when I had a current-gen Titan X) and have also done OpenCL on several vendors' hardware.
kristianp 4 hours ago [-]
Speaking of the P14s, I have an Intel version from two years back and the battery life is poor. And I hunger for the Mac's screen for occasional photography. The other thing I found difficult is that there's no equivalent of the X1 Carbon with an AMD chip; it's Intel only. The P14s is so much heavier.
woodruffw 44 minutes ago [-]
I'm sympathetic to all of this except the part about DynamoRIO: I've barely seen people compile DynamoRIO successfully on Windows and Linux, so struggles on macOS don't seem that unusual. It seems like a marginal case to ding the Mac on.
(I have a handful of patches in DynamoRIO.)
1-more 10 hours ago [-]
> I still don't have it fully set up
Highly recommend doing nix + nix-darwin + home-manager to make this declarative. Easier to futz around with.
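In case it helps anyone get started, a minimal flake wiring the three together might look something like this. The hostname "my-mac" and user "me" are placeholders, and the module options are from memory, so double-check against the nix-darwin and home-manager manuals:

```nix
{
  description = "Declarative macOS setup (sketch)";

  inputs = {
    nixpkgs.url = "github:NixOS/nixpkgs/nixpkgs-unstable";
    nix-darwin.url = "github:nix-darwin/nix-darwin";
    nix-darwin.inputs.nixpkgs.follows = "nixpkgs";
    home-manager.url = "github:nix-community/home-manager";
    home-manager.inputs.nixpkgs.follows = "nixpkgs";
  };

  outputs = { nixpkgs, nix-darwin, home-manager, ... }: {
    darwinConfigurations."my-mac" = nix-darwin.lib.darwinSystem {
      modules = [
        { nixpkgs.hostPlatform = "aarch64-darwin"; }
        home-manager.darwinModules.home-manager
        ({ pkgs, ... }: {
          system.stateVersion = 5;
          # System-wide packages live here...
          environment.systemPackages = [ pkgs.ripgrep ];
          # ...per-user dotfiles/programs live in home-manager.
          home-manager.users.me = {
            home.stateVersion = "24.05";
            programs.git.enable = true;
          };
        })
      ];
    };
  };
}
```

Then something like `darwin-rebuild switch --flake .#my-mac` applies the whole thing in one go.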
unshavedyak 10 hours ago [-]
Seconded. I have a mostly CLI setup and in my experience Nix favors that on Mac, but nonetheless it makes my Nix and Linux setups a breeze. Everything is in sync, love it.
Though if you don't like Nixlang it will of course be a chore to learn/etc. It was for me.
1-more 9 hours ago [-]
drop $20/month for any LLM and you don't even have to learn it!
mauflows 6 hours ago [-]
LLMs are uniquely bad at writing nix configs I've found. The top models all regularly hallucinate options
Really useful for debugging though
1-more 5 hours ago [-]
Really? This surprises me. I've used them for projects and for my home-manager setup and it's always been amazing at it. The best example I can come up with is packaging a font I needed into a nix package for a LaTeX file. It would have taken me a month of trying various smaller projects to know how to do that.
unshavedyak 8 hours ago [-]
Honestly it helped quite a bit. There are a lot of (IMO) obscure errors in Nix that LLMs spot pretty quickly. I've made quite a bit of progress since using them for this.
1-more 8 hours ago [-]
Yeah it's like 99% responsible for all of my flake files; I wasn't being facetious!
root_axis 7 hours ago [-]
Yes, macOS sucks compared to Linux, but the M chip gets absolutely incredible battery life, whereas the Framework gets terrible battery life. I still use my Framework at work though.
ElijahLynn 6 hours ago [-]
Yes, there is a dilemma in the Linux space. But is running Linux on a MacBook a viable option these days? Is Asahi Linux solid enough?
I much prefer a Framework and the repairability aspect. However, if it's going to sound like a jet engine and have half the battery life of a new M-series Mac, then I feel like there's really no option if I want solid battery life and good performance.
Mac has done a great job here. Kudos to you, Mac team!
zdragnar 3 hours ago [-]
If you don't need a dedicated graphics card, there are plenty of laptops that get 12 or more hours of battery life (8 under heavy load, such as when compiling), which is perfectly fine for me. The LG gram was my most recent one, and it required zero tweaks to power management, battery, SSD, or any other settings.
jay_kyburz 6 hours ago [-]
Oh, I have that tab still open from when I was reading the other thread.
Here is the feature support matrix from Asahi. Looks like there's still a way to go unless you're on an old M1?
> Everything is just slightly different. I had to split all my dot files into common/Linux/Mac specific sections. Don't expect to be able to clone and build any random C++ project unless someone in the project is specifically targeting Mac.
This seems like a very unfair complaint. macOS is not Linux. Its shell environment comes from Darwin, which is derived from BSD. It has no connection to Linux beyond its UNIX certification.
yoavm 2 hours ago [-]
Why is it unfair? The OP literally stated "To any Linux users". They aren't saying it's worse, just that if you're coming from Linux it can be hard to adapt. Sounds reasonable to me.
As a Linux user, I sometimes dream about Apple hardware, and I tell myself, "How hard can it be to get used to macOS? It has a shell after all!" The OP reminded me that it can be quite difficult.
Philpax 2 hours ago [-]
That can be true while still being a genuine irritant. Windows and POSIX shells are different enough that you'd never assume a script would be compatible between them, but the same is not true between your average Linux distro and macOS, which leads people to repeatedly get bitten when trying to write a script that supports both.
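A concrete example of where such scripts get bitten: in-place sed edits. GNU sed takes `-i` with an optional suffix, while BSD/macOS sed requires a (possibly empty) suffix argument. A hedged sketch of handling both, using the fact that BSD sed has no `--version` flag:

```shell
#!/bin/sh
# Portable in-place edit: GNU sed (Linux) vs BSD sed (macOS).
f=$(mktemp)
printf 'hello\n' > "$f"
if sed --version >/dev/null 2>&1; then
  sed -i 's/hello/world/' "$f"      # GNU sed: -i takes no argument
else
  sed -i '' 's/hello/world/' "$f"   # BSD sed: -i needs a suffix ('' = none)
fi
result=$(cat "$f")
echo "$result"
rm -f "$f"
```

Many scripts instead sidestep the whole question with `perl -pi -e`, which behaves the same on both platforms.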
martypitt 10 hours ago [-]
I appreciate this comment!
I'm often envious of these Macbook announcements, as the battery life on my XPS is poor (~2ish hours) when running Ubuntu. (No idea if it's also bad on Windows - as I haven't run it in years).
Thanks for the heads-up.
coldtea 9 hours ago [-]
Not that useful as a heads up.
macOS is great for development. Tons of high-profile devs, from Python and ML to JS, Java, Go, Rust and more, use it - the very people who headline major projects for those languages.
2ish hours battery life is crazy. It's 8+ hours with the average Macbook.
theodric 9 hours ago [-]
> It's 8+ hours with the average Macbook.
Did I get a dud? I rarely get over 2.5
leakycap 9 hours ago [-]
If you are on an M-series MacBook and aren't running a 3D benchmark the entire time, your Mac is broken if it's dying after 2.5 hours.
Have you checked your Battery Health?
If you have an Intel-based Mac, it's the same expected battery life as Windows, and 2.5 hours on an Intel MacBook battery sounds decent for something 5+ years old.
avtar 8 hours ago [-]
8+ hours sounds about right. I have an M1 MacBook Pro, and even 5 years later I can still use it (a million browser tabs, a couple of containers, messaging apps) for an entire day without having to charge it.
tverbeure 8 hours ago [-]
Yeah, you have a dud. Or you have some process running in the background that's gobbling up all the energy.
brokencode 8 hours ago [-]
With Apple Silicon? Yes, that is very low for typical dev usage.
Gaming is another story though, as is any other use that puts a lot of stress on the GPU.
What are the differences, though? I have an MBP and a PC with Fedora on it, and I barely see any differences aside from sandboxing in my atomic Kinoite setup and a different package manager.
People often hate on brew, but as a backend dev I haven't encountered any issues for years.
leakycap 8 hours ago [-]
The issue I see people struggle with on a Mac is that development often needs things set up in a non-default and often less-secure way.
There isn't a "dev switch" in macOS, so you have to know which setting is getting in your way. Apple suppresses error alerts whenever possible, so when things in your dev environment fail, you don't know why.
If you're a seasoned dev, you have an idea why and can track it down. If you're learning as you go or new to things, it can be a real problem to figure out if the package/IDE/runtime you're working with is the problem or if macOS Gatekeeper or some other system protection is in the way.
ruszki 5 hours ago [-]
I can tell you in one sentence: try to run your own DNS server when mDNSResponder sits on port 53 (for example, because you use the new virtualization framework).
And there are a lot of such things that are trivial or a non-problem on Linux.
foxandmouse 5 hours ago [-]
I thought the same thing when I saw the M5 in the news today. It's not that I hate macOS 26; hate implies passion. What I feel is closer to disappointment.
The problem is their philosophy. Somewhere along the way, Apple decided users should be protected from themselves. My laptop now feels like a leased car with the hood welded shut. Forget hardware upgrades, I can’t even speed up animations without disabling SIP. You shouldn’t have to jailbreak your own computer just to make it feel responsive.
Their first-party apps have taken a nosedive too. They’ve stopped being products and started being pipelines, each one a beautifully designed toll booth for a subscription. What used to feel like craftsmanship now feels like conversion-rate optimization.
I’m not anti-Apple. I just miss when their devices felt like instruments, not appliances. When you bought a Mac because it let you create, not because it let Apple curate.
Panzer04 4 hours ago [-]
Usually there's an accessibility option of some kind that disables animations; at least it exists in android and I feel like it existed in iOS (though I haven't used that in ages). I'm surprised Mac doesn't have something similar.
karmelapple 3 hours ago [-]
> To install a 3rd party window manager you need to disable some security setting
Depends what you mean by window manager, but an app like Magnet does not require disabling security settings.
I hear you. An Apple hardware + Linux combination would have been great for me.
uaas 7 hours ago [-]
How about Asahi Linux, or a Fusion/Parallels VM on macOS?
JadeNB 7 hours ago [-]
Can't you literally install Linux on Apple hardware?
botanical76 7 hours ago [-]
Yes, with major tradeoffs. Asahi Linux is an amazing project, but they have not yet figured out how to get anywhere close to a Mac's power efficiency when it is running macOS. For example, you will lose a lot of battery life[0][1] with the lid closed, whereas on macOS you lose pretty much nothing.
I also like the multi-desktop experience on KDE more, but I've recently found out you can at least switch off some of the annoying behavior in the Mac settings, so that e.g. it no longer switches to another desktop when you click a dock icon that's open on another desktop.
achandlerwhite 10 hours ago [-]
What are the quirks with local dev that make it not fun?
jayd16 10 hours ago [-]
There have surprisingly been a lot of permission headaches and rug pulls in the last few big OS updates, and they've been really annoying.
konart 10 hours ago [-]
Any examples? I've been using MBPs since 2014 and haven't seen any changes recently, except that you have to hit the "allow" button a few times.
jayd16 8 hours ago [-]
They're basically things along those lines. It's more nefarious when background services quietly error out and you need to dig to find it was a newly required permission.
EvgeniyZh 8 hours ago [-]
Launching an unsigned app now requires going to Settings manually and allowing it there, instead of just allowing it on launch.
asdff 2 hours ago [-]
Right click > open no longer works?
coldtea 9 hours ago [-]
I use macOS with all kinds of languages locally, plus VMs, Kubernetes, LLMs, etc., and have seen no such issues.
What "permission headaches"?
watermelon0 8 hours ago [-]
Launching unsigned apps is a problem, especially if an app bundle contains multiple binaries, since by default you need to approve an exception for each of them separately.
I know that it's possible to script that, since Homebrew handles it automatically, but if you just want to use a specific app outside of Homebrew, the experience is definitely worse than on Linux/Windows.
mholm 8 hours ago [-]
There are a lot of annoying hurdles when allowing some types of application access: needing to manually allow things in the security menu, allowing unrecognized developers, unsigned apps. Nothing insurmountable so far, but progressively more annoying for competent users who want control over their devices.
Zizizizz 9 hours ago [-]
For me, coming from Linux, the only things I don't like: the overview menu's lack of an (x) to close a window; the way Slack stacks windows within the app, so it's hard to find the right one; and that pressing the red button doesn't remove the app from your CMD+Tab cycle, you also have to press CMD+Q. (Just a preference for how Windows and Linux treat windows: actually closing them.) Rectangle resolved the snap-to-corner thing (I know macOS has it natively too, but it's not as good in comparison).
Things I prefer: Raycast + its plugins compared to the Linux app-search tooling, battery life, performance. Brew vs the Linux package managers, I don't notice much of a difference.
Things that are basically the same: the dev experience (just a shell and my dotfiles, essentially the same between OSes).
tracker1 9 hours ago [-]
I think the hardest part for me is getting used to using CMD vs CTRL for cut/copy/paste; then, when I start to get used to it, a terminal breaks me out of it with a different key for Ctrl+C. I got used to Ctrl+Shift in terminals on Linux (and Windows) for cut/copy/paste, etc.
It may seem like a small thing, but when you have literal decades of muscle memory working against you, it's not that small.
platevoltage 7 hours ago [-]
I'm a lifelong Mac user, so obviously I'm used to using CMD instead of CTRL. Inside the terminal we use CTRL for things like CTRL-C to exit a CLI application.
What messes me up when I'm working on a linux machine is not being able to do things like copy/paste text from the terminal with a hotkey combo because there is no CMD-C, and CTRL-C already has a job other than copying.
IMO Apple really messed up by putting the Fn key in the bottom-left corner of the keyboard instead of Ctrl. Those keys get swapped on every Mac I buy.
tracker1 7 hours ago [-]
Ctrl+Shift+(X,C,V) tends to work for many/most terminals on Linux and Windows (including VS Code and the new Windows Terminal)...
I agree on the Fn key positioning... I hate it in the corner and tend to zoom in on keyboard photos when considering laptops for anyone, just in case. I've also had weird arrow keys on the right side of a laptop keyboard, where in practice I'd often hit the up arrow instead of the right shift... it really messed up text input.
platevoltage 7 hours ago [-]
I knew there must be some extra hot key, but like you said, muscle memory.
It's the same thing when switching from a Nintendo to a Western game where the cancel/confirm buttons on the gamepads are swapped.
saltcured 7 hours ago [-]
As a very long-term Linux user, I'm still aggravated when implicit copy and middle-click paste doesn't just work between some apps, since it is so deeply embedded in my muscle memory!
vel0city 7 hours ago [-]
I'm only a recent MacOS user after not using it for over 20 years, so please people correct me if I'm wrong.
But in the end the biggest thing to remember is in MacOS a window is not the application. In Windows or in many Linux desktop apps, when you close the last or root window you've exited the application. This isn't true in MacOS, applications can continue running even if they don't currently display any windows. That's why there's the dot at the bottom under the launcher and why you can alt+tab to them still. If you alt+tab to an app without a window the menu bar changes to that app's menu bar.
I remember back to my elementary school computer lab with the teacher reminding me "be sure to actually quit the application in the menu bar before going to the next lesson, do not just close" especially due to the memory limitations at the time.
I've found once I really got that model of how applications really work in MacOS it made a good bit more sense why the behaviors are the way they are.
jtbaker 10 hours ago [-]
Docker being nerfed is pretty much the only thing I can think of.
I saw the announcement, and it looks like a cool tool. But I don't think it supports Docker Compose specs, which a lot of my projects use for running services (like Postgres) locally during development. And there doesn't seem to be any Kubernetes support; e.g., it still needs to run through Colima etc.
sofixa 10 hours ago [-]
Docker works very weirdly (it's a desktop application you have to install, it has usage restrictions in enterprise contexts, and it runs inside a VM so some things don't work), or you have to use an alternative with similar restrictions (Podman, Rancher Desktop).
The OS also has weird rough edges when used from the terminal - there are read-only parts, there are restrictions on loading libraries, multiple utilities come with very old versions or BSD versions with different flags than the GNU ones you might be used to coming from Linux, the package manager is pretty terrible. There are things (e.g. installing drivers to be able to connect to ESP32 devices) that require jumping through multiple ridiculous hoops. Some things are flat out impossible. Each new OS update brings new restrictions "for your safety" that are probably good for the average consumer, but annoying for people using the device for development/related.
craigds 6 hours ago [-]
Docker on Mac has one killer feature though: bind mounts remap permissions sensibly, so the uid/gid in the container is the correct value for the container rather than the same uid/gid as on the host.
The workarounds on the internet are like "just build the image so that it uses the same uid you use on your host," which is batshit crazy advice.
I have no idea how people use Docker on other platforms where this doesn't work properly. One of our devs has a Linux host and was unable to use our dev stack, and we couldn't find a workaround. Luckily he's a frontend dev and eventually just gave up on the dev stack in favour of running Requestly to forward the frontend from prod to his local tooling.
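For reference, the mitigation usually suggested for Linux hosts (rather than baking a uid into the image) is to run the container as the host user at runtime, e.g. in a compose file; the service name, image, and paths here are placeholders:

```yaml
services:
  app:
    image: myimage:latest
    # Run as the host user so files created in the bind mount stay
    # owned by you on the host. UID/GID must be exported in the shell
    # or set in an .env file, since the shell's $UID is not exported
    # by default; 1000 is a common first-user fallback.
    user: "${UID:-1000}:${GID:-1000}"
    volumes:
      - ./src:/app/src
```

Whether this works depends on the image (some entrypoints need root to start), which may be why it didn't pan out for the stack described above.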
coldtea 9 hours ago [-]
>The OS also has weird rough edges when used from the terminal - there are read-only parts, there are restrictions on loading libraries, multiple utilities come with very old versions or BSD versions with different flags than the GNU ones you might be used to coming from Linux, the package manager is pretty terrible.
You use nix or brew (or something like MacPorts).
And they are mighty fine.
You shouldn't be concerned with the built-in utilities.
astrange 8 hours ago [-]
IIRC many of the built-in tools were updated from FreeBSD in the last release, but they'd still be different from GNU.
sofixa 5 hours ago [-]
Brew is pretty terrible, though. It's slow, and doesn't handle updates/versions/dependencies all that well.
I've had it make major updates (with breaking changes) to random software when asked to install something unrelated.
brew is fine. Not the best package manager, not the worst one either.
anajimi 10 hours ago [-]
I suggest trying Nix on macOS. It's very nice as a package manager, but it can also be used as a way to replace Docker (at least for my needs it works very well).
These days I don't even bother installing brew on my Mac; I only use Nix.
skinnymuch 6 hours ago [-]
Very interesting. Based on skimming how it works and how it can replace Docker, it seems I'm going to start using Nix.
wingworks 6 hours ago [-]
I get the comment about Docker. Not being able to share memory with Docker makes it a pain to run things alongside macOS, unless you have mountains of RAM.
mike-cardwell 5 hours ago [-]
I've been forced to use MacBooks for development at work for the past 7 years. I still strongly prefer my personal ThinkPad running Debian for development in my personal life. So don't just put it down to lack of familiarity.
kgc 5 hours ago [-]
Try AeroSpace. Completely solved window management for me.
Also, for dev: set up your desired environment in a native container and then just remote into it with your terminal of choice. (I personally recommend Ghostty with Zellij or tmux.)
TranquilMarmot 3 hours ago [-]
> I'll probably replace it with a framework at some point in the near future.
I kind of did the opposite. I have a first-gen Framework and really enjoy it, but WOW, that thing runs scorchingly hot and loud. Too hot to put on your lap even for basic workflows. Battery life is also horrible: maybe ~4 hours if you're doing any sort of heavy work, ~6 hours if you're just browsing the web. Did I mention it's loud? The fans spin up and sound like a jet engine. The speaker is also substandard, if that matters to you: it's inside the chassis and has no volume or bass.
Last year I replaced it with an M4 Pro MacBook and the difference is night and day. The MacBook stays cool and quiet and gets 10+ hours of battery doing the same sort of work. The trade-off is not being able to use Linux (yes, I know about Asahi; the tradeoffs are not worth it), but I have yet to find anything I need that I could only do on Linux.
I also _despise_ the macOS window manager. It's so bad.
bonsai_spool 10 hours ago [-]
I recommend looking at Lima for setting up deterministic build environments on a Mac! I use it with Ansible to provision testing environments.
Hello! Yes! Writing this on my commute home using my company's M3 Pro, and I hate it. I'm waiting for a new joiner so I can hand this off to someone with a different brain to mine.
I could write up all the details, but it's well covered on a recent linuxmatters.sh episode; Martin did a good job of explaining what I'm feeling: https://linuxmatters.sh/65/
lexarflash8g 3 hours ago [-]
Big fan of the podcast. Late Night Linux btw
AlexeyBrin 8 hours ago [-]
macOS has a different dev culture than Linux, but you can get pretty close if you install the Homebrew package manager. For running LLMs locally, I'd recommend Ollama (easy) or llama.cpp. Thanks to the unified memory, you should be able to run larger models than on a typical consumer-grade GPU, just slower.
stevenwalton 5 hours ago [-]
> To any Linux users,
I have a MacBook Air and I pretty much use it as an ssh machine. It is definitely overpriced for that, but it at least beats the annoyance of dealing with Windows and all the Word docs I get sent or Teams meetings... (Seriously, how does Microsoft still exist?)
Since I mostly live in the terminal (Ghostty) or the web browser, I usually don't have to deal with stupid Apple decisions. Though I've found it quite painful to do even basic things when I want to use my MacBook like I'd use a Linux machine, especially since functionality can change dramatically after an update... I just don't get why they (and other companies) try to hinder power users so much. I understand we're small in number, but usage doesn't usually follow a flat distribution.
> I had to split all my dot files into common/Linux/Mac specific sections
There are often better ways around this. On my machine, my macOS config isn't really about macOS specifically but about which programs I might be running there[0]. Same goes for Linux[1], which you'll see is pretty much just about CUDA and aliasing apt to nala if I'm on a Debian/Ubuntu machine (sometimes I don't get a choice).
I think what ends up being more complicated is when a program has a different name under a distro or version[2]. Though that can be sorted out with a little scripting. This definitely isn't the most efficient way to do things, but I write it like this so that things are easier to organize, turn on/off, or for me to try new things.
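The common/Linux/Mac split described above can be sketched as a single shared rc file that branches on `uname`; the aliases and tools here are just illustrative examples of things that differ between the GNU and BSD userlands:

```shell
#!/bin/sh
# One shared rc file with per-OS branches instead of separate dotfiles.
os=$(uname -s)
case "$os" in
  Darwin)
    alias ls='ls -G'             # BSD ls: color via -G
    copy_cmd=pbcopy              # macOS clipboard tool
    ;;
  Linux)
    alias ls='ls --color=auto'   # GNU ls: color via --color
    copy_cmd=xclip               # one common Linux clipboard tool
    ;;
esac
echo "configured for $os"
```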
What I find more of a pain in the ass is how commands like `find`[3] and `grep` differ. But usually there are ways to get them to work identically across platforms.
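One such `find` difference: GNU find defaults the starting path to `.`, while BSD/macOS find requires it explicitly. Always passing the path (and sticking to POSIX predicates) keeps the same invocation working on both:

```shell
#!/bin/sh
# Always give find an explicit path; bare `find -name '*.txt'` works
# with GNU find but errors out with BSD find on macOS.
d=$(mktemp -d)
touch "$d/notes.txt" "$d/image.png"
matches=$(find "$d" -type f -name '*.txt')
echo "$matches"
rm -r "$d"
```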
> Don't expect to be able to clone and build any random C++ project unless someone in the project is specifically targeting Mac.
For me, a VM set up via UTM works quite well on my Mac. Just make sure you do not virtualize x86; that kills both performance and battery life. This way I get the nice battery life and performance in a small package, but am not limited by macOS for my development.
surfingdino 4 hours ago [-]
Can you not install Parallels and use it to run Linux under Mac OS X?
thehamkercat 9 hours ago [-]
Local development has been fine for me on an M4 Pro running Sequoia (I switched from Arch Linux), not much different.
But I absolutely hate macOS 26; my next laptop won't be a MacBook.
It's a shame what they did to this awesome hardware with a crappy update.
skinnymuch 6 hours ago [-]
My issue is how much I care about looks. If it’s not pretty, I have a harder time using stuff outside the CLI/TUI.
Linux is too ugly for me to use as my main device. Same with what I’ve seen of Android.
yoavm 2 hours ago [-]
In my opinion, Apple is the one doing very poorly on (software) looks recently. Liquid Glass looks like a joke. Both KDE and GNOME look better. The new Material 3 Expressive on Android actually looks great.
godelski 6 hours ago [-]
I suggest you head over to /r/unixporn, and you'll probably be pleasantly surprised. Contrary to popular belief, most of this stuff is not very hard to set up. Of course, there are people also showing off custom tooling and the craziest (and sometimes silliest) things they can pull off, but a beautiful interface is usually something you can do in under an hour. It is customary to include a repo with all configurations, so if you wanted to copy-paste directly, you could do it much faster than that.
Unless you're talking about the look of the physical machine. Well, then that's an easier fix ;)
Pretty funny how they're actively targeting M1 users in their marketing copy for this release.
Sorry you made your first gen chip so good that I don't feel the need to upgrade lol.
michelb 8 hours ago [-]
One of the problems is that I don't notice a meaningful difference (that's worth the money) between my M1 and my M4 for my workloads (dev/video). Obviously the rendering is faster, but the OS isn't. Tahoe makes my M2 feel like an Intel Mac.
Chip, memory and storage are really fast, but I'm fully convinced the OS is crippling these machines.
snovymgodym 5 hours ago [-]
It's unfortunate that MacOS continues to get worse as the hardware gets better.
artdigital 53 minutes ago [-]
When I handed in my M1 Max MBP for repair, I bought an M3 (or M4?) MBA, to return when I got my MBP back or to keep if I really noticed a difference.
Used it for a week and came to the same conclusion: I felt absolutely no difference in day-to-day usage, except that the MBA is nice and slim. And better battery.
leakycap 7 hours ago [-]
> Sorry you made your first gen chip so good that I don't feel the need to upgrade
M1 MacBooks are ~5 years old at this point, and if you've been working a laptop hard for 5 years, it's often worth upgrading for battery life as well as speed.
kalleboo 17 minutes ago [-]
I spilled a sugary coffee drink in my keyboard, and having it replaced under AppleCare+ meant they also replaced half the rest of the machine (battery, display etc). I think I'm set for another 5 years.
Noaidi 5 hours ago [-]
I still have 94% of my battery health left on my 2020 M1, and I have all the speed I need as a casual, average user.
manquer 2 hours ago [-]
What magic are you and everyone else doing? I'm already at only 92% on my less-than-a-year-old M4 14-inch Pro.
I haven't booted up the older M1 recently to check, but I remember it was throwing replace-battery warnings well before I got the upgrade, and I think that triggers below 80%.
artdigital 51 minutes ago [-]
Yes, if it hits 79% you can swap it under AppleCare+ at the Apple Store. My M1 Max MBP is on its third replacement and still great!
I use AlDente to further optimize the battery, and calibrate it now and then.
ahmeneeroe-v2 5 hours ago [-]
Crazy good. I'm only at 87% on mine, and I think I got it in 2022 (an M1, though)... I wonder if it's because I leave it plugged in so much.
leakycap 5 hours ago [-]
I first saw it on a device 2 years older than your M1, so you might be in a similar boat soon, but I hope not.
The lower power and heat of M-series devices might result in meaningfully longer battery life; I'm curious to find out.
BolexNOLA 5 hours ago [-]
Frankly, nothing holds a candle to the battery performance of the M-series machines, so it's likely a safe bet that that advantage will also translate into longer overall life/battery health, until we see otherwise. We'll see in a few years, I suppose.
DrProtic 6 hours ago [-]
Battery life on the M1 16" was so good when new that even severely degraded it's still pretty good.
leakycap 6 hours ago [-]
I felt the same way about the battery in my 2018 MacBook... it was losing capacity, but I didn't mind, as it still ran for hours between charges.
Then it started having issues waking up from sleep. Only the OG Apple charger could wake it up; then it would see it actually had 40-60% battery, but something had gone wrong while sleeping and it thought it was empty.
Intel MacBooks had terrible SMC issues, so maybe this won't afflict the M-series. Just sharing because I could still use that MacBook for a few hours between charges; it just couldn't be trusted to wake up without a charger. That's really inconvenient, and combined with new features it got me to upgrade.
nektro 6 hours ago [-]
um no; that's a reason to upgrade the battery, not get a new laptop
leakycap 6 hours ago [-]
How much would you charge me to swap out my MacBook Pro 2018 15.4" battery using authorized methods to not cause other damage? I want my laptop back within a few days, a 90 day warranty on parts and labor, and I want a genuine Apple battery - not some unknown 3rd party.
nicoburns 4 hours ago [-]
For battery replacement you really want to take this to a local electronics repair shop (read the reviews so you find a good one). They'll do it same day or next day.
It'll probably be around $200-$300 if you want an official battery. More like half that if you're willing to accept a 3rd party one.
leakycap 2 hours ago [-]
Apple doesn't sell the official battery as a standalone part even to Authorized Apple repair shops, so anyone telling you they're installing an official battery is already lying to you or putting in something used.
Even if a local shop somehow sourced a legit, new Apple battery, why wouldn't I go to the Apple Store if it's the same cost and would only be the battery?
(For $299, Apple replaces the speakers, touchpad, batteries, top case, and keyboard and provides a parts and labor warranty for 90 days)
Noaidi 5 hours ago [-]
Yeah, I am using a 2020 MacBook Air M1 with macOS 15.7.1 (on which I am about to install Asahi Linux) and I have no issues as a casual user. For most people who use MacBooks, I see no reason to buy an M5 or M4 over an M1.
BolexNOLA 5 hours ago [-]
How’s that running on M machines these days?
drcode 4 hours ago [-]
Asahi runs great on M1 and M2, but not on the newer chips.
tracker1 9 hours ago [-]
I got that as well... more annoying are the comparisons with the last Intel options, which sucked even then.
I'm still doing fine with a 16GB M1 Air; I mostly VPN+SSH to my home desktop when I need more oomph anyway. It lasts a full day, or all week on vacation when you're just checking email once a day.
cryptoegorophy 5 hours ago [-]
I have a 2020 MacBook, Intel... 2.3 GHz Quad-Core Intel Core i7.
Is it worth upgrading?
KaiserPro 5 hours ago [-]
Very much so.
No fan noise, no warmth, unless you are really really pushing it.
In terms of speed, it makes it feel like the original retina did when they first came out. Oh, and a pretty fast disk as well.
ahmeneeroe-v2 5 hours ago [-]
>makes it feel like the original retina
Exactly right. M1 MacBook Pro delighted me in a way that Macs haven't done since my 2013 Retina MBP
celsoazevedo 5 hours ago [-]
In terms of performance, thermals, and battery life, it was a huge upgrade for me when I moved from Intel to the M1 Max. M1 Max to the M4 Max... improvements were mainly on very heavy tasks, like video transcoding with Handbrake or LLMs with Ollama.
snovymgodym 5 hours ago [-]
Yes... to any Apple Silicon machine.
kondro 8 hours ago [-]
Just give me cellular in a MacBook Air already, Apple, if you want me to insta-buy! Bonus points for OLED.
Airs don't have to be just cheap. I want a thin and light premium laptop for walking around, and a second Mac (of any type) for my desk.
leakycap 8 hours ago [-]
I'm guessing carriers/networks can't handle a fleet of MacBooks-with-cellular yet. The data workload would be sustained and intense, since macOS doesn't have the kind of system-level cellular framework and data controls that iPadOS and iOS do (I have used the low data mode on macOS; it helps but only handles a small part of the problem).
I have bought cracked-screen iPhones ever since Personal Hotspot first allowed wired connections, velcro'd them to the back of my MacBook screen, and have been living the "I have internet on my Mac everywhere" life since then. With 5G, I can't really tell when I'm on Wi-Fi vs. when my MacBook opts for the hotspot connection.
I'd love a cellular MacBook and would also insta-buy, but I've given up hope until the next network upgrade.
kondro 4 hours ago [-]
That doesn’t make much sense to me, there are literally billions of phones that people are using all the time.
Apple has over 2.3 billion active devices, of which a small percentage are Macs (an estimated 24 million were sold in 2024, and around twice that many iPads).
The most difficult to scale part of a cell network is number of devices connected, not bandwidth used anyway and cellular Macs aren’t going to add significantly more load to a network. And that assumes that Apple even cares what a carrier thinks.
I’m in Australia, not the USA, and for all people like to complain about internet here, we have excellent mobile coverage and it’s relatively affordable, but it’s all priced by usage.
I have 4 devices on my plan with shared 210GB of 4G usage between them for around AUD$200 (USD$130) a month on Australia’s best network (Telstra). I work remotely from cafes a lot (probably around 20-30 hours a week) as a developer and get nowhere close to that usage. I update all my apps, download all my podcasts, listen to lossless music and stream video whenever I want during breaks (although I’m not a huge out-of-home video consumer). I do literally nothing to limit my bandwidth usage and am lucky to use 30-40GB a month across all my devices.
xethos 2 hours ago [-]
> The most difficult to scale part of a cell network is number of devices connected, not bandwidth used
Not a network engineer, but isn't it possible that it's only easy to scale the number of devices because mobile devices play nice with the network? For example, battery life depends on batching network requests, so the incentives are aligned between Google, Apple, and the carriers.
If every device defaults to treating the network like a LAN, as macOS is accustomed to being able to do, that may change which part of the network is easy to scale.
B1FF_PSUVM 12 minutes ago [-]
> cracked-screen iPhones
Sheesh, what do you have against MiFi 4G pocket routers?
paradox460 7 hours ago [-]
macOS has, for a little while now, had a "metered" flag for networks. Not sure which apps, if any, respect it, but it's there.
leakycap 7 hours ago [-]
> (I have used the low data mode on macOS, it helps but only handles a small part of the problem)
Yes, I mentioned that in the post you responded to.
> Not sure which apps, if any, respect it, but it's there
It reduces my data consumption by about a fifth. Not nothing, but the Mac can easily consume hundreds of GB of data a week doing "normal" activities. YouTube on a MacBook is many times more data than the equivalent on a phone screen.
jazzyjackson 7 hours ago [-]
I know minis don't sell well but I wish they kept the Air 11" format but without the bezel one way or another
My craving has been answered by the GPD WIN MAX 2, a 10" mini laptop with lots of ports and bitchin' performance (the AI 9 chip sips battery). It's Windows, but with an upgrade to Pro to disable the annoying stuff via Group Policy, plus never signing into a Microsoft account, it's amazing how much faster it is than a machine that's always trying to authenticate to the cloud before it does anything. Wake from sleep is also excellent, which was the main thing that kept me using MacBooks. Anyway, it's the first computer I've bought in a decade that has some innovation to it.
Edit: there's a slot for a cellular modem but I haven't done enough research to find one that will play nice on US networks
joshstrange 5 hours ago [-]
I've heard (but not tested) that Tahoe and iOS 26 do a _much_ better job of auto-connecting and reconnecting (if your cell drops, like going through a tunnel or similar) to make it easier to use your phone with your MBP.
I hope this is the case. I don't know if I would buy a cellular MBP (just wouldn't use it enough) but better tethering is a huge win for me.
brailsafe 4 hours ago [-]
When they finally put cellular in Macbooks, I bet they only put them in the Air for like 2 years, because we all know serious professionals are always at a desk with WiFi
apparent 7 hours ago [-]
Why carry around two cellular modems? Are you ever out and about with your computer but not your phone? I've been happy to hotspot my computers and tablets to my phone, which I always have with me.
The only possible issue I can think of is battery life, but if I'm carrying around my laptop I can throw a charge cable in the bag to keep my phone juiced.
brailsafe 4 hours ago [-]
> Why carry around two cellular modems?
Why not? If I had both with me, I'd rather just have my phone on Airplane mode preserving the battery and my focus.
> Are you ever out and about with your computer but not your phone? I've been happy to hotspot my computers and tablets to my phone, which I always have with me.
I'd really, really like to be. The amount of dependence I have on the phone being there at all times is insane. I just want to leave with my laptop and be good to go, with no possibility of receiving a call or getting distracted by stupid group chats.
My phone recently died spontaneously, and if I hadn't replaced it immediately, I couldn't have worked online from cafes or anywhere else without depending on the place having open wifi.
leakycap 7 hours ago [-]
I leave my setup plugged in, using a low-profile USB-C to lightning cable on the iPhone SE stuck on the back of my screen and wired hotspot on macOS is a great experience.
We're discussing a MacBook someday with a built-in phone, the closest I've found is an iOS device wired to my MacBook as a wired hotspot. It's like having fast wifi everywhere.
Using my personal phone (that I also use for other things like calls) wouldn't be like having wifi everywhere on my Mac, for example if I walk away from my laptop while on the phone the Mac would lose internet.
kondro 4 hours ago [-]
I want my computer to have an always on cell modem just like my phone does.
The Apple Silicon chips all run in a version of always on these days because the efficiency cores are so, well, efficient.
Additionally, while you may want to burn the battery in multiple devices and deal with having to manage that, I don’t want to.
Apple has been selling cellular iPads since the beginning and I love never having to worry about pairing mine.
Tethering to an iPhone or iPad is much better than it used to be, but it’s still not perfect.
Apple makes their own modems these days, and even with Qualcomm they had a capped per-device license fee, more than covered by the premium they charge for cellular in, say, the iPad.
I know so many people who want this convenience and are willing to pay for it that it just seems like stubbornness at this point that they’re willing to put modems in iPads and not MacBooks.
themagician 6 hours ago [-]
The pairing has become almost flawless as well. Years ago, it was slow and inconsistent, but now the hotspot feature is almost perfect and automatic. Honestly, I don’t really think about it anymore.
abtinf 3 hours ago [-]
It’s hard to explain, but having an always-on connection on a tablet or laptop just feels like absolute magic.
You sort of have to experience it first hand.
raw_anon_1111 7 hours ago [-]
For tablets, at least with T-Mobile, for $25 you get unlimited data. You only get a limited amount of tethering data.
apparent 4 hours ago [-]
True, but don't you have to treat the device as a separate line? If the laptop had cellular I'd have another bill to pay.
raw_anon_1111 3 hours ago [-]
To be completely honest, I would be fine not having cellular data on my iPad and don’t think I would ever use more than 40GB of tethering if I took advantage of WiFi at every opportunity instead of not worrying about it.
It’s just one of those things that it’s convenient not having to worry about WiFi when we travel and hotel WiFi depending on how busy they are is often pretty bad.
But especially with a laptop, as often as we travel, I don’t think I’ve ever needed to tether to my Mac except for brief periods when our condo’s shared WiFi went out (I work remotely).
I wouldn’t pay for a separate line for a computer. I am sure others would.
On another note, I did give my mom my previous iPad and kept the data plan so she doesn’t have to worry about WiFi when they take road trips.
leakycap 7 hours ago [-]
T-mobile is so scammy, though. Have you been keeping up with all the lawsuits against them in the US?
raw_anon_1111 6 hours ago [-]
All carriers are scammy in their own way
apparent 4 hours ago [-]
MVNOs FTW. They know they're competing for price-conscious consumers so have to offer more value. The big 3 know most of their customers are going to go with one of the big boys, all of whom are expensive and not great.
raw_anon_1111 4 hours ago [-]
MVNOs have slower data rates since they buy deprioritized traffic in bulk, don’t have the roaming agreements domestically and especially not internationally, and don’t offer unlimited high speed data.
apparent 1 hour ago [-]
Hm, not my experience. When traveling internationally there was an option I could have used, but I chose to use a local SIM card instead. The data speeds are just fine for me, and I haven't experienced any issues with roaming domestically.
But then I don't even care about 5g versus 4g/LTE for the most part, so perhaps I'm just not noticing limits that affect others.
raw_anon_1111 55 minutes ago [-]
That’s how MVNOs work: they buy data in bulk at wholesale prices, but they pay for lower QoS. There isn’t anything wrong with that, but they are getting jankier bandwidth.
T-Mobile comes with 5GB of high speed data per month to use for roaming in Canada and Mexico and lower speed data roaming almost anywhere else in the world.
leakycap 6 hours ago [-]
So, maybe don't do business with the worst of them? Of the big 3 in the US, T-Mobile is the one I'd avoid right now.
raw_anon_1111 6 hours ago [-]
Verizon is overpriced and is the Comcast equivalent of cellular carriers. AT&T is about as bad
leakycap 5 hours ago [-]
Verizon owns most of the value brands in the US and you made no connection between them and Comcast beyond mentioning them in the same sentence.
AT&T is "about as bad" as what? You gave no information.
raw_anon_1111 5 hours ago [-]
Price, customer service
leakycap 5 hours ago [-]
Based on the type of responses you are giving, I actually do believe you probably call your phone company's customer service regularly. So perhaps your criteria might be different. Have you heard of Consumer Cellular?
raw_anon_1111 4 hours ago [-]
Consumer cellular rate plans are the same price as T-Mobile and they don’t have international roaming included.
leakycap 2 hours ago [-]
That joke jitterbugged right past you.
raw_anon_1111 2 hours ago [-]
You would be surprised how many people on HN will defend crappy products and service because of ideals - see Framework laptops, PinePhones etc.
dbg31415 2 hours ago [-]
Just out of curiosity why do you want cellular in a laptop? Just seems like another monthly service bill.
I just tether to my phone. Wouldn’t that work?
letmetweakit 11 hours ago [-]
Apparently, in Europe, the box will not contain a charger [1]. This is absolutely mind-blowing to me.
edit: the suggested retail price also dropped by EUR 100. Mind is less blown now. It seems like a good thing in fact.
edit2: in Belgium, the combined price of the 70W adapter and 2m USB-C to MagSafe is EUR 120.
Removing the charger is a good move, in my opinion.
USB-C chargers are everywhere now. Monitors with USB-C or Thunderbolt inputs will charge your laptop, too. I bought a monitor that charges over the USB-C cable, and I haven’t used the charger that came with the laptop in years because I have a smaller travel charger that I prefer for trips anyway.
You don’t have to buy the premium Apple charger and cable. There are many cheap options.
I already have a box of powerful USB-C chargers I don’t use. I don’t need yet another one to add to the pile.
ahmeneeroe-v2 9 hours ago [-]
The battery is good enough that I often travel with just my phone charger. I can plug the laptop in at night, when the slow charge rate isn't a hindrance, and be fine with the all-day battery life.
jazzyjackson 7 hours ago [-]
I was really surprised to find that the MacBook didn't mind charging from the 20W phone brick
dijit 4 hours ago [-]
it will even charge with the old USB-A brick.
Takes like 10 hours and isn't officially supported I think, but it does work.
xethos 2 hours ago [-]
I would expect it to be backwards-compatible - USB-A charging would be, like you said, slow as hell, but it's still standard USB
Nintendo I have no expectations for, but Apple isn't (IMO) that egregiously bad with backwards compatibility.
ksec 9 hours ago [-]
>USB-C chargers are everywhere now
USB-C 15W chargers may be everywhere, but the higher-power chargers required for a MacBook Pro are not.
I would have agreed if the device used 10W or 20W, where you could charge it slightly slower. Not for a 70W to 100W MacBook Pro though.
kmacdough 8 hours ago [-]
I've got several 50+W chargers from other devices (old MBP, soldering iron, a generic one). If you don't have a high-power charger, buy one. Easy enough. But there are plenty of us who don't need another.
al_borland 7 hours ago [-]
Every time I’ve gotten rid of an old laptop the charger goes with it. They are a package deal in my book.
I actually have very few USB-C chargers. With everyone leaving them out of the box, I don’t happen to have a bunch of them by chance. They took them out of the box before giving time for people to acquire them organically. I never bought a single lightning cable, but almost all my USB-C cables had to be purchased. This is not great, considering how confusing the USB-C spec is.
Other than the one that came with my M1 MBP (which I will lose when I sell it), I have had to purchase every charger I have.
Not being able to charge a $1,500+ laptop without buying a separate accessory is crazy to me. I’ve also seen many reports over the years comparing Apple chargers to cheap 3rd party ones where there are significant quality differences, to the point of some of the 3rd party ones being dangerous or damaging. I don’t know why Apple would want to open the door to more of that.
I assume a lot of people will use a phone charger, then call support or leave bad reviews, because the laptop is losing battery while plugged in. Most people don’t know what kind of charger they need for their laptop. My sister just ordered a MacBook Air a couple weeks ago and called me to help order, and one of the questions was about the charger, because there were options for different chargers, which confused her and had her questioning if one even came with it or if she had to pick one of the options. This is a bad user experience. She’s not a totally clueless user either. She’s not a big techie, but in offices she used to work with, she was the most knowledgeable and was who they called when the server had issues. She also opened up and did pretty major surgery on her old MacBook Air after watching a couple YouTube videos. So I’d say at least 50% of people know less than her on this stuff.
Apple positions themselves as the premium product in the market and easy to use for the average user. Not including the basics to charge the internal battery is not premium or easy. I can see it leading to reputational damage.
apparent 7 hours ago [-]
I have some 50+W chargers from old devices also. However, they are much, much bigger than the current ones. It doesn't matter when my computer is plugged in at home, but I wouldn't want to travel with one since it's easily 3x the size/weight.
leakycap 9 hours ago [-]
My MacBook M1 Pro w/ 441 cycles started doing a fun thing where if the battery gets under about 50% and you put it to sleep, the ONLY way to power on the device is to use the exact charger it came with. Higher powered Apple Studio Display PD, or even good 3rd party chargers, do not bring it back to life. This occurs even when the battery has 40-60% remaining if the laptop goes to sleep.
Had a similar issue with my 2018 MBP Intel - the 86/87 Watt Apple charger was the only thing it would come to life with as the battery aged if the device got too low.
chrismorgan 8 hours ago [-]
My dad had similar trouble with an M1 MacBook Pro that got a depleted battery. Two chargers he had wouldn’t work, but fortunately the Anker charger that I used for my laptop did work with one of the cables that I had (though not another). Once it got a little juice into it, then it was fine and he could switch back to his. But I think he was a bit more careful to avoid total depletion after that.
In 2018 I had a phone that entered a boot loop: battery depleted, plug it in, it automatically starts booting, it stops charging while booting, it dies due to depletion, it recognises it’s plugged in and starts charging, boot, stop, die, start, boot, stop, die… I tried every combination of the four or five cables that I had with a car USB power adapter and someone’s power bank, nothing worked. Diverted on my way home (an 8 hour journey) to buy another phone because I needed one the next day. When I got home, I tried two or three power adapters with all the cables and finally found one combination that worked. I learned the lesson that day that allowing devices to deplete completely can be surprisingly hazardous.
cheema33 5 hours ago [-]
> I learned the lesson that day that allowing devices to deplete completely can be surprisingly hazardous.
The solution is to keep your devices charged. This is feasible if you have a few devices. Not practical for someone like me. I have too many devices. I don't use every device daily.
leakycap 8 hours ago [-]
Yes, I don't often let batteries deplete, but the issue I'm having on my MacBooks is that they will "die" with plenty of charge (often 40-60% left). But the computer thinks it is at 0% and won't boot past the "plug me in!" screen with anything except the OG charger from the Apple box. As soon as you connect the OG charger, it boots automatically and you see 0% battery go to 40-60% battery. At this point, you can unplug the MacBook and use it - as long as you don't put it to sleep. Obviously battery/power related, but the only fix is using the charger that says/does exactly what my MacBook wants. I wonder how Apple handles this on the M5.
kmacdough 8 hours ago [-]
If it's dead or if it's low?
In my experience a low-power charger will revive, you just must wait for it to hit enough SOC since it is effectively starting off the battery. This does take a while, but starting dead on a supply that can't guarantee enough power would be dumb.
leakycap 8 hours ago [-]
My M1 Pro with 441 battery cycles won't power back on without the Apple charger it came with if I close the lid or put it to sleep and the battery isn't over 60%... something happens and the computer goes into a sleep state where the battery doesn't drain but no charger except the OG brings it back to life.
Even a Studio Display, which can provide more power than my M1 Pro can use, won't wake it from this state. Apple wants $300 for a replacement battery so I'll just buy a new MacBook at that price, but the charger situation doesn't bode well for M5 MacBook buyers who wonder why their Mac is dead one day (and they just need the exact charger the system wants, but Apple didn't provide it)
ahmeneeroe-v2 5 hours ago [-]
>Apple wants $300 for a replacement battery
Looks like iFixit thinks it's only a "moderate" difficulty replacement that should only cost you $109
Replacing a MacBook battery is a lot of delicate work. Not everyone has steady hands, great eyes, etc. For $300, the Apple Store is a better deal for most (and guaranteed to be a quality battery with warranty) compared to the difficulty of a $110 battery kit.
I don't want to use a 3rd party battery in a device I carry with me most places I go...
ahmeneeroe-v2 3 hours ago [-]
I agree.
I re-did the battery on my 2013 MBP well after the Apple support period (~2020). I don't think I'd try it on a still-supported Mac unless I was very price sensitive.
SkyPuncher 10 hours ago [-]
I agree. I currently have 2 or 3 brand spanking new chargers sitting in the original Mac box.
On the go, I've bought a small GaN charger with multiple ports. At home, I already have all of my desks wired up with a USB-C charger.
semi-extrinsic 10 hours ago [-]
Adding to the anecdata, the magsafe charger for my M1 MBP has been used a grand total of two times in five years.
physicsguy 7 hours ago [-]
USB-C chargers are everywhere but definitely not at the wattage needed to drive a laptop.
apparent 7 hours ago [-]
When I buy a new laptop and sell my old one, I either have to sell the old one with the charger or keep the charger and sell the old one without. I don't actually have a bunch of spare chargers capable of charging a laptop (phone, sure).
This is especially true for someone moving up to an MBP from an MBA, which takes less juice.
yohannparis 11 hours ago [-]
I think it's awesome.
I have way too many chargers, especially USB-C: 5, 10, 20, 35, 70, and 95W all over the house and office.
If you need one, just shell out the extra $100 for one that corresponds to your needs.
handsclean 8 hours ago [-]
Many people got played on the charger thing. It’s never free, it’s a mandatory bundle. But companies only put one line item on the receipt, never refer to the primary component separately but instead conflate its name and idea with the bundle, and when forced to de-bundle (usually) bump the primary component’s price to compensate, and people buy it: “the EU took away my charger!”
Chargers don’t change quickly. If I lost my charger from 2019, the ideal replacement in 2025 would be literally exactly the same model—and mine still works like new and looks good. I have nothing to gain from buying a new charger.
We should be cheering the EU for ending an abuse that the US has long failed to.
Also, it still bundles a USB-C to MagSafe 3 cable.
crazygringo 8 hours ago [-]
I mean, every laptop needs a charger.
If you sell your old laptop when you buy a new one, you generally sell it with the old charger. And different Apple laptops take chargers of different maximum watts (they're compatible but not optimal), so they're not all the same anyway.
There's a reason they generally make sense to bundle. Especially with laptop chargers, which provide a whole lot more power than some random little USB-C charger you might have. Sometimes letting the free market decide actually gives customers what they want and find most useful.
handsclean 7 hours ago [-]
> If you sell your old laptop when you buy a new one, you generally sell it with old charger.
Sounds like a symptom of incompatibility. I’ve only ever included the charger when it was specific to the laptop.
> And different Apple laptops take chargers of different maximum watts (they're compatible but not optimal), so they're not all the same anyways.
Chargers automatically provide whatever power level is needed, up to their max, and charger wattage hasn’t steadily ticked upward the way we’re used to elsewhere. The MacBook Pro did get a faster charger a few years ago, relegating old ones to that “compatible but not optimal” state, but meanwhile MacBook Air chargers got slower, and most releases didn’t change the charger at all. Certainly there are sometimes benefits to buying a new charger, but that happens much less often than new device purchases, and even then the purchase should still be the customer’s choice.
> Sometimes letting the free market decide actually gives customers what they want and find most useful.
I agree, but “free market” doesn’t mean lawlessness, it means an actual market that’s actually free. Actual market: companies compete on economics, not e.g. violence or leverage over consumers. Actually free: consumers freely choose between available options. Bundling is a very classic example of an attempt to circumvent free market economics, using the greater importance of one choice to dictate a second choice.
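The "chargers automatically provide whatever power is needed, up to their max" behavior above is USB-PD negotiation: the charger advertises a set of voltage/current profiles and the device picks one, drawing only the power it actually needs. A toy sketch of that idea (heavily simplified; the profiles and the selection rule here are illustrative, not taken from the PD spec):

```python
# Toy sketch of USB-PD style negotiation (simplified; real PD is a wire
# protocol with many more rules). Profile values are illustrative only.

def negotiate(source_pdos, sink_max_volts, sink_needed_watts):
    """Pick the best (volts, max_amps) offer the sink can accept.

    source_pdos: list of (volts, max_amps) profiles the charger advertises.
    Returns (chosen_profile, watts_actually_drawn), or None if no
    profile is usable (fall back to basic 5V USB).
    """
    usable = [(v, a) for v, a in source_pdos if v <= sink_max_volts]
    if not usable:
        return None
    # Take the highest-power profile the sink can handle...
    volts, max_amps = max(usable, key=lambda p: p[0] * p[1])
    # ...but draw only what the sink needs, capped at the profile's max.
    drawn = min(volts * max_amps, sink_needed_watts)
    return (volts, max_amps), drawn

# A hypothetical 140W charger powering a laptop that wants at most 96W:
offer, watts = negotiate([(5, 3), (9, 3), (15, 3), (20, 5), (28, 5)], 28, 96)
```

The same function also shows why a weak phone brick still "works": with only `[(5, 3), (9, 3)]` on offer, the laptop gets 27W and just charges slowly instead of failing.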
crazygringo 6 hours ago [-]
> Bundling is a very classic example of an attempt to circumvent free market economics, using the greater importance of one choice to dictate a second choice.
Only when there's no competition and you can use that to abuse market power.
But competition for laptops is strong. Most consumers want their laptops to come with a charger, even if you personally don't. That's why they're sold that way.
Like, nobody says the free market is failing because Coke forces me to buy carbonated H2O along with their syrup at the grocery store. The market prefers it when they're bundled.
handsclean 3 hours ago [-]
I think you missed the point about bundling. In this instance, the first choice is your laptop choice, and the second is your charger choice. Because the first choice is far more important than the second, the party that’s chosen in the first can also dictate the second. There is no competition for who supplies chargers that come with MacBooks, nor should one expect there to be. This is a generalization of the mechanism by which many segment monopolies work, for example regional ISPs, where the two choices are most of your life vs your ISP.
> Most consumers want their laptops to come with a charger, even if you personally don't. That's why they're sold that way.
Citation needed, on both counts. Plenty of counter-examples in this thread. Non-tech people I know aren’t charger crazed, they’re mildly amused or annoyed by their inexplicable excess of chargers.
> Like, nobody says the free market is failing because Coke forces me to buy carbonated H2O along with their syrup at the grocery store.
I’d say it is indeed failed / nonexistent there, it’s just that nobody cares, because its potential benefit is so small it’s outweighed by overhead. Chargers aren’t laptops or cars or houses, but, as you said, there’s a lot more to them, and they’re more expensive and contribute significantly to e-waste. There actually is a charger market, and it’s better when it’s more free.
To be clear, the healthier market I’m envisioning is one where consumers can make charger purchasing decisions freely, not one where nobody’s allowed to also offer a bundle.
crazygringo 2 hours ago [-]
> Non-tech people I know aren’t charger crazed, they’re mildly amused or annoyed by their inexplicable excess of chargers.
"Charger crazed"? Huh?
They're amused by too many cheap underpowered phone and small device chargers. Not laptop chargers. Those are bigger and you don't usually have any extra.
There isn't much of a "charger market" for laptops, except people who want a second one for a second location. I've never heard of anybody with a Macbook who wanted to buy a non-Apple charger instead. And now Magsafe is back!
Like, my Macbook also bundles a keyboard, a screen, a trackpad, a battery, and so forth. Sure the charger isn't connected with adhesive, but it's still a unified product. You need a charger to use a Macbook, and most people don't have an extra laptop charger with enough power otherwise.
Forcing them to be sold separately for laptops is just silly.
lm28469 10 hours ago [-]
The real crime is that it starts at 1799 euros, which is $2100, vs $1599 in the US, I know US prices are before tax but even with 20% VAT you're far off...
StopDisinfo910 9 hours ago [-]
Apple overprices everything in the EU on top of not shipping new features. Currency risk is a thing but nowhere near the premium they charge. I personally vote with my wallet and stopped buying anything from them.
rogerrogerr 9 hours ago [-]
Why don’t you vote with your vote and oppose the regs that are preventing Apple from shipping new features in the EU?
randunel 8 hours ago [-]
Those regulations don't prevent Apple from shipping anything; they prevent companies from abusing their users. Apple is free to ship without abusing anyone, but explicitly chooses otherwise.
masklinn 9 hours ago [-]
1599 plus a 20% VAT is 1918. That's... far from the worst it's been TBH.
chrismorgan 8 hours ago [-]
ASUS’s ZenBook Duo (UX84060) is over 50% dearer in Australia than in the USA.
When it was announced, I expected it to be at least 4000 AUD (~2600 USD). When I heard it was starting at 1500 USD instead (~2300 AUD), I was astonished and very excited. And it still is that price… but only in the US. In Australia it is 4000 AUD (the 32GB/1TB model, which is 1700 USD, ~2600 AUD). So I sadly didn’t get one.
Is the rest of the world subsidising the US market, or are they just profiteering in the rest of the world?
jiggawatts 1 hours ago [-]
Australia has actual consumer protections such as mandatory warranty repair for years on most high end electronics. This is the “real price” that includes the vendor’s estimated failure rate and cost to repair.
Americans pay the same amount, but… stochastically.
PS: Health care is similar. Australians pay a fairly predictable amount via taxes and Medicare, Americans gamble with bankruptcy every time they break a leg. But hey, if they don’t break a leg then the “system works”!
whydid 9 hours ago [-]
The EU requires a 1 year warranty on electronics, where in the US it's only 90 days. The higher cost of electronics reflects this.
> Under EU rules, if the goods you buy turn out to be faulty or do not look or work as advertised, the seller must repair or replace them at no cost. If this is impossible or the seller cannot do it within a reasonable time and without significant inconvenience to you, you are entitled to a full or partial refund. You always have the right to a minimum 2-year guarantee from the moment you received the goods. However, national rules in your country may give you extra protection.
> The 2-year guarantee period starts as soon as you receive your goods.
> If a defect becomes apparent within 1 year of delivery, you don't have to prove it existed at the time of delivery. It is assumed that it did unless the seller can prove otherwise. In some EU countries, this period of “reversed burden of proof” is 2 years.
leakycap 9 hours ago [-]
> where in the US it's only 90 days
As far as I know, the US has zero warranty laws. It can be zero days.
*) 2 year is the warranty for consumers in the EU. 1 year only for business/enterprise customers
spookie 9 hours ago [-]
They've always been like this; it's why their market share is as low as it is vs the US.
thegreatpeter 9 hours ago [-]
what happened to US tariffs? how can they be cheaper in the US than EU?
wmf 9 hours ago [-]
Apple is exempt from tariffs.
platevoltage 7 hours ago [-]
Tim Cook kissed the ring. That's all it takes.
reactordev 9 hours ago [-]
Tariffs... Someone has to pay them and it sure isn't Apple.
Moto7451 9 hours ago [-]
I think you read him backwards. It’s still cheaper in the US. Tariffs certainly exist in Europe but I’m unaware of any on these laptops and US Tariffs on goods from China don’t apply to goods from China to anywhere else that isn’t the US.
theodric 9 hours ago [-]
It should rather be "public option healthcare, social safety nets, and a robust surveillance state aren't going to pay for themselves"
MagnumOpus 9 hours ago [-]
Macbooks shipped to Europe don't ever touch US ground (and I'd wager 99.9% of their parts don't either). So US tariffs should be irrelevant - and the EU doesn't have big China tariffs outside of EV and solar-panel anti-dumping retaliation.
kwanbix 9 hours ago [-]
Of course, you’re not wrong.
Apple could subsidize by absorbing part of the tariff in the U.S. and overcharging in the EU.
That said, in the EU we have a two-year warranty.
VAT in the U.S. is no more than 12%.
pigeonhole123 9 hours ago [-]
It's much cheaper in the US
forgotoldacc 9 hours ago [-]
I have a strong feeling Apple is raising prices elsewhere in order to avoid pissing off the notoriously sensitive consumers in America. Sony is doing similar things with making the PlayStation expensive everywhere to make it affordable for Americans. The world is essentially subsidizing the tariffs for Americans.
People in other countries will get pissed but ultimately suck it up and buy a product. People in America will take it as a personal offense due to the current Maoist-style cult of personality, and you'll get death threats and videos of them shooting your products posted onto social media. Just look at what happened to that beer company. No such thing would happen in Germany.
ahmeneeroe-v2 4 hours ago [-]
>The world is essentially subsidizing the tariffs for Americans.
I was told the opposite thing would happen. Sounds like a great deal for us Americans!
reactordev 9 hours ago [-]
> I have a strong feeling Apple is raising prices elsewhere in order to avoid pissing off the notoriously sensitive consumers in America.
Or a certain individual…
scosman 10 hours ago [-]
Don't forget the environmental impact of a smaller box. The box will probably be less than half as thick, doubling shipping efficiency. These are air freight, so the CO2 impact is not negligible.
I'll take the discount and use one of my 12 existing USB-C chargers.
leshenka 10 hours ago [-]
There are more 90W-capable USB-C chargers in my home than there are laptops. I certainly don't need another one.
Honestly I'd be fine with them removing the box altogether and using a paper envelope like Steve Jobs did once.
coldtea 9 hours ago [-]
>Don't forget the environmental impact of a smaller box
Compared to the marginal environmental impact of sourcing materials, building hardware and parts, assembling, shipping, stocking, and transporting each unit to the customer, the box could be 10x larger and it wouldn't make a dent.
leakycap 9 hours ago [-]
> ship ... the box could be 10x larger and it wouldn't make a dent
This is not how shipping works.
A larger box, even by 1 inch in any direction, absolutely makes a huge difference when shipping in manufacturing quantities. Let's not pretend physical volume doesn't exist just to make an argument.
10 planes flying with MacBooks == much different than 1 plane (in other words, when you 10x the size of something, as you suggest, it does actually have a huge impact)
coldtea 6 hours ago [-]
The point being made is "it's not the paper for the box that's the issue".
A smaller box allows more to be carried. But if we go that route, it's trivial to ship them without any box and box them domestically - and that's a 2-3x volume reduction right there.
leakycap 6 hours ago [-]
> it's trivial to ship them without any box and box them domestically
Ah yeah I can't imagine any scenario where this could go wrong
Like man-in-the-middle attacks
Replacement/fake products
... or you know, damage? Boxes provide... protection.
> it's trivial
Anytime you catch yourself thinking something is trivial, you're probably trivializing it (aka think about it more and you'll probably come up with a dozen more reasons packaging products is the norm)
hexbin010 9 hours ago [-]
Bizarrely you can only select a new adapter during the configuration if you select 24GB RAM or higher
Prices are about 65 EUR for a 70W (tested DE + CH)
The EU law states they must provide an SKU without an adapter - i.e. they're still allowed to offer one with a power adapter.
tobi_bsf 9 hours ago [-]
Not sure about that. I never use my official charger or the MagSafe cable, but man, how did we arrive here? Some things just belong with a laptop.
chvid 11 hours ago [-]
My mbp came with a 140w charger - which I never use.
giancarlostoro 9 hours ago [-]
> Apparently, in Europe, the box will not contain a charger [1]. This is absolutely mind-blowing to me.
Same, for a laptop??? Really? Wild. You can charge these with USB-C chargers too.
lysace 10 hours ago [-]
Base 14" MBP M5 prices without VAT or sales taxes:
Germany: 1758 USD (1512 EUR) without charger.
US: 1599 USD with 70W charger.
This feels like an insult.
bombcar 10 hours ago [-]
Dropping the price was nice. They could have gotten away with a slight reduction in price, and a coupon inside to send away for a "free" charger, and then bask in the millions who couldn't be bothered to do it.
tracker1 9 hours ago [-]
What really bugs me is that the huge performance gains are against the M1 and a (5-7 year old) Intel Mac that, from my own memory, had throttling and overheating issues. While not as impressive, I'd really appreciate it if they simply showed the generational gains, or actual charts against several previous generations.
I'm still pretty happy with my 16gb M1 Air, but it would be nice to know some closer to real world differences.
whynotminot 6 hours ago [-]
> I'm still pretty happy with my 16gb M1 Air, but it would be nice to know some closer to real world differences.
I’m confused — they made a comparison that is directly relevant to your situation and you don’t like it?
Most people with an M4 won’t be looking to upgrade to an M5. But for people on an M1 (like you) or even an older Intel chip, this is good information!
yed 4 hours ago [-]
I think they're trying to decide whether it's worth jumping all the way to an M5 or whether they'd rather just get an M4 or M3 at discount.
leakycap 8 hours ago [-]
You can't expect Apple to make an argument against their own chips... you're asking them to admit that they are making ~20% a year improvements when they want buyers to think it's a multiples-of-X improvement.
jonplackett 6 hours ago [-]
How did we get to no one being impressed by 20% better PER YEAR already?
Macs barely got faster for ages with Intel - they just got hotter and shorter on battery life.
20% per year is a doubling every 4y. That is awesome.
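The doubling claim checks out; here's a quick sanity check (my numbers, just compounding 20% per year):

```python
# Compound a 20% yearly gain until performance doubles.
rate = 0.20
speed = 1.0  # relative performance, baseline year
years = 0
while speed < 2.0:
    speed *= 1 + rate
    years += 1
print(years, round(speed, 2))  # 4 years -> ~2.07x
```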
leakycap 5 hours ago [-]
> How did we already get to no-one being impressed by 20% better PER YEAR already.
When has 20% been impressive? When Intel to M1 happened, the jump was huge ... not 20%. I can't think of anything with a 20% jump that made waves, even outside of tech.
When I used to do user benchmarking, 20% was often the first data point where users would be able to notice something was faster.
4 minutes vs 5 minutes. That's great! Kind of expected that we'll make SOME progress, so what is the low bar... 10%? Then we should be impressed with 20?
People aren't upgrading from M1, M2, M3 in numbers... so I don't think it's just me that isn't wow'd.
bigyabai 3 hours ago [-]
> Macs barely got faster for ages with Intel
Intel chips were getting faster. It's well documented (and glaringly obvious in the i9 16") that Apple just didn't want to accommodate the full TDP. They tweaked their ACPI tables to run the chips until they hit the junction temp so they were both constantly hot and constantly throttling. Apple tweaked all of their Intel chips in this way, which was a software solution to the Apple-designed hardware simply being unable to cope with the thermal stress.
We know this because the Intel Macbook Pro chassis was only ever used to run Apple Silicon chips that were passively cooled, not Pro/Max variants. The old MBP chassis designs are so awful that Apple doesn't consider them viable for cooling ARM CPUs. I blame Ive, not Intel.
leakycap 2 hours ago [-]
> Intel chips were getting faster
Do you consider margin-of-error, single-digit gains to be worth arguing over? Intel offered 14nm for 4 years straight: Skylake, Kaby Lake, Coffee Lake, Coffee Lake Refresh—four different names, same process node, and 3-7% gains each year. Such fast.
> The old MBP chassis designs are so awful that Apple doesn't consider them viable for cooling ARM CPUs
You don't put a 15-20W chip into a thermal system built for 90W+. The old chassis wasn't "too awful" for Apple Silicon, it was completely unnecessary.
eagleinparadise 10 hours ago [-]
Anyone know when to expect the M5 Pros? I am on a base 16GB M1 and struggling hard with daily workloads. I am often running at 20GB of swap usage.
I don't really use local LLMs but think 32GB RAM would be good for me... but I am so ready to upgrade but trying to figure out how much longer we need to wait.
fellowniusmonk 9 hours ago [-]
First rule of Mac world is get the most memory you can afford.
I got the cheapest M1 Pro (the weird one they sold that's binned due to defects) with 32GB RAM and everything runs awesome.
Always get the most RAM you can in Mac world. Running a largish local LLM model is slowish, but it does run.
A Mac out of memory is just a totally different machine than one with enough, probably because most of the devs building the software are on the highest RAM possible and there's just so much testing and optimization they don't do.
Casteil 5 hours ago [-]
Part of me misses my OG base 14" M4 Pro. The battery on that thing was absolutely phenomenal - literal 12-14+ hours of real-world use. Not so much on the 14" M1 Max (64GB) that I upgraded to after about 2 yrs.
'Real-world idle' efficiency on the newer chips is the main reason I've got the (slight) itch to upgrade, but 64GB+ MBPs certainly don't come cheap.
mjamesaustin 9 hours ago [-]
Rumors suggest they might be early next year, or likely by spring.
jsheard 10 hours ago [-]
There was a 6 month gap between the M4 and M4 Pro, so maybe a while.
leakycap 8 hours ago [-]
This gap makes no sense to me. I wonder if Apple is just leaning into this cycle because it's easier to make M5s than more advanced processors, so you can sell this sooner?
From a buyer's perspective, I don't like it at all.
mholm 8 hours ago [-]
Different chip SKUs are often a TON of work. Releasing all of them at the same time would pile that work into the same stages of the pipeline simultaneously. By staggering them, you spread the work out across the year.
masklinn 8 hours ago [-]
M series chips are ridiculously massive, and as Apple apparently does not want to transition to chiplets, they can't easily compose CPUs. Thus refining the process and improving yields on the smaller parts first probably makes sense.
As another example, the current Ultra part is the M3 Ultra, released in early 2025, after even the M4 Pro/Max and a good 18 months after the M3 was unveiled. We might not see an M4 Ultra until 2027.
ksec 8 hours ago [-]
No WiFi 7, only WiFi 6E, is annoying, especially for what they are charging. And Bluetooth 5.3: their Pro Macs are slower than their iPhone Pro.
The SSD has double the speed. I guess they say this only for the M5 MacBook Pro, because the base M4 always had slower SSD speeds than the M4 Pro, at 3.5GB/s. So the M5 should now be at 7GB/s.
I assume no update to SDXC UHS-III.
Casteil 5 hours ago [-]
The non-pro/max chipped MBPs have always been a little 'lower spec' in several regards. There used to be a little more separation though, with the non-pro chips available only in the Air & 13" MBP, but back then people complained about Apple having 'too many models'...
I suspect the M5 Pro/Max chipped MBPs will bring some of these improvements you're looking for.
rfw300 8 hours ago [-]
What use case do you have (or anticipate having) for WiFi 7 out of curiosity?
pzmarzly 7 hours ago [-]
With NVMe NASes and 5Gbit and 8Gbit FTTH available at a reasonable price in many places, it's easy to saturate any WiFi connection just by downloading stuff (games, AI models, etc), backing up files, or accessing files on a NAS (and editing videos straight off a NAS is trendy lately).
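As a rough illustration (the 40 GB model size, the link rates, and the 70% efficiency factor are all my assumptions, not the commenter's), here's how long a big download takes at a few effective link speeds:

```python
# Minutes to move size_gb over a link, assuming some protocol overhead.
def minutes(size_gb, link_gbit, efficiency=0.7):
    gbits = size_gb * 8  # convert GB to gigabits
    return gbits / (link_gbit * efficiency) / 60

# Assumed effective link rates, illustrative only.
for name, gbit in [("WiFi 6E", 2.4), ("WiFi 7", 5.8), ("8G FTTH", 8.0)]:
    print(f"{name}: {minutes(40, gbit):.1f} min")
# -> WiFi 6E: 3.2 min, WiFi 7: 1.3 min, 8G FTTH: 1.0 min
```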
We went from 10 hours to 24 hours in 5 years - impressive.
I wonder why they advertise gaming on the laptop. Does anyone play anything meaningful on MacBooks?
mjamesaustin 9 hours ago [-]
I play absolutely everything on my M1 Macbook Pro. Through Crossover, basically every Windows game runs fine. I used to check ahead of time before buying a game, but it's so good I now kind of just assume games work.
leakycap 8 hours ago [-]
An NVIDIA 2080 graphics card from 2018 still surpasses the M5 for gaming. The M5 Pro coming early next year will likely finally catch up with the 7-year-old 2080.
I'm happy to hear your games work well for you, but it sounds like the games you're playing aren't demanding compared to modern AAA titles. Most games released in the last year or two don't run well on my 2080 test system at anything approaching decent graphics.
amaranth 6 hours ago [-]
A 2080 is about the same performance as a 5060 and every game is going to be able to run on a 5060. You might not be running it at 4K Ultra with ray tracing enabled but you should be able to run at like 1080p High or better.
Whether or not the M5 GPU is actually capable of that level of performance or whether the drivers will let it reach its potential is of course a completely different story. GPU performance is hard to estimate based on raw specs, you just have to run benchmarks and see what you end up with.
leakycap 6 hours ago [-]
> A 2080 is about the same performance as a 5060
A 5060 outperforms a 2080 by roughly 20% on most titles, across the board, not cherry-picking for the best results. They are not about the same.
> you should be able to run at like 1080p High or better
This is disconnected from reality. 1080p low/medium, some games are playable but not enjoyable. Remember, I actually have a 2080, so I'm not just guessing.
> GPU performance is hard to estimate based on raw specs, you just have to run benchmarks and see what you end up with.
Rich coming from someone who claims a 7-year-old graphics card is "about the same" as a card which has 2.5x better ray tracing, 3x faster DLSS, faster VRAM, and much better AI capabilities. The 2080 can't even encode/decode AV1...
abtinf 2 hours ago [-]
> 5060 outperforms a 2080 by roughly 20%
Is this a typo? I’m surprised the difference is so small after 3 generations.
AstroBen 5 hours ago [-]
A reminder that the majority of what people actually play isn't "modern AAA titles": https://steamcharts.com/top
leakycap 5 hours ago [-]
> I'm happy to hear your games work well for you, but it sounds like the games you're playing aren't demanding compared to modern AAA titles.
That's why I made the specific distinction in the comment you're responding to
When a $599 Windows laptop with a 3060 can play AAA titles and your $1599 MBP can't, I wouldn't normally call that great for gaming.
asdff 2 hours ago [-]
I can confirm this is not the case. I don't use Crossover, but Crossover is Wine, and I use Kegworks, which is also Wine. It is a pain in the ass. The Steam app runs like garbage over Wine. For games like TF2, I got more consistent fps on my 2012 Mac, which can run it natively, than I do on this 12/16-whatever-core 4GHz beast. I used to get a pegged 90fps on that; now, even on low graphics configs, I'm barely doing 50fps and losing frames like crazy when there's a lot of action.
Most games I've tried just flat out do not work. Plenty will stop you with anticheat.
About the only sorts of games that actually work well over Wine on this rig (M3 Pro) are ones that came out 15 years ago.
Native games like No Man's Sky actually got worse over time. When I first got this Mac I was so impressed by its performance in NMS, even though it's an old game at this point; I could run it entirely on ultra. Then NMS put out an update and that ended: back to medium-low and no AA unless I want pervasive graphical glitches like flashing purple.
Other games have some internal lock on their fps, even as native Mac games, and I'm not sure why. This is true for Cities: Skylines. It is capped at 40fps, maybe through Rosetta layer limitations? I'm not sure.
youniverse 9 hours ago [-]
I would guess they are mostly talking to game devs for now, but man, in a few years, if Apple can get me to throw out the Windows rig that I (and I imagine many others) keep around just for gaming, I wouldn't hate that!
leakycap 8 hours ago [-]
The gaming world is so deeply ingrained with Windows technologies. Even with the GPK from Apple, I don't see the mods and patches that some Windows players enjoy.
nrjames 9 hours ago [-]
I do almost all of my gaming on an M3 MacBook Air. It’s great for games. I’ll sometimes hop on Windows for titles unavailable on the Mac, but increasingly I just skip them if they aren’t on Steam for MacOS.
leakycap 8 hours ago [-]
I get that it is good for some games, but when people say "gaming PCs" on Windows, they usually mean AAA titles. The stuff on endcaps at BestBuy for sale for PC and console. Those games won't run well on Macs unless you spend insane amounts on a Max or Ultra variant.
The M5's GPU specs seem to put it near a high-end NVIDIA card from 2018. Impressive as all get out for a power-friendly chip, but not really what I think of when I hear "good for gaming"
greenpresident 9 hours ago [-]
Baldur's Gate 3 runs great on the M4 Pro at 1080p (I'm on a Mac mini though).
defraudbah 9 hours ago [-]
The Mac mini is amazing; still, happy to hear people play games on MacBooks.
shepherdjerred 9 hours ago [-]
Most of the games I play (League of Legends, Civ, Factorio) work really well on my MacBook.
asdff 2 hours ago [-]
As they should. Those are like 10-year-old games.
The trend I see that is more concerning is that previously Mac-friendly game devs have already abandoned the platform. Valve no longer maintains macOS builds of their games like CS or TF2. Cities: Skylines 1 had a first-party Mac release, but Cities: Skylines 2 skipped macOS.
PeterCorless 7 hours ago [-]
Someone I know saw the 14" starts from $1,599 and the 16" starts at $2,499, and quipped "The most expensive 2 inches ever."
However, it is not just because of the larger display.
M5 14" starts at:
10-Core CPU
10-Core GPU
16GB Unified Memory
512GB SSD Storage
M5 16" starts at:
14-Core CPU
20-Core GPU
24GB Unified Memory
512GB SSD Storage
So it's the cost of 4 more CPU cores, 10 more GPU cores (double), and +8GB of memory.
DASD 6 hours ago [-]
The 16" doesn't offer the M5 (yet), but rather the M4 Pro and Max CPUs. The difference is also a higher number of performance cores vs efficiency cores, and memory bandwidth is significantly higher in the M4 lines (273 and 410 GB/s) versus the M5 (153 GB/s).
ElijahLynn 6 hours ago [-]
Where did you get these specs from? The page linked in the OP says the 16 in is only available in M4.
NoPicklez 3 hours ago [-]
I'm looking to upgrade from a 2013 MacBook Pro to the MacBook Air. I noticed that the pixels per inch are lower on the latest MacBook Air, but I assume the display is overall better?
Also, I read that the keyboard is slightly different between the Air and the Pro. I'm not a big fan of that chiclet design they released.
asdff 2 hours ago [-]
If you use the MacBook outside, don't bother with the Air. The current MBP has a brighter screen, and imo it is only barely bright enough to be usable outside on max brightness. Not as bad as the 2012-era screens, but nowhere close to your cellphone.
daft_pink 10 hours ago [-]
Apple’s chip release schedule is so borked. It should be high-end Pro and Studio first, and then iPad, Air, Mini, and the downgraded Pro. Why they release the iPad and low-end Pro first is beyond me.
Everyone buying their high end gear is buying something waiting to be refreshed now.
Topfi 7 hours ago [-]
Hasn't that been the case throughout the industry for the last two decades? Back when Intel was still on tick-tock, the low-powered laptop chips were always first, then mainstream desktop, then workstation and server, roughly a year delayed. Maybe I'm mistaken, but it seems to make sense: if you mainly offer monolithic chips, you'd want to start with a smaller die size to better leverage the process.
AMD is somewhat of an exception/unique case though, having chiplet and monolithic designs depending on the use case, plus console/semicustom offerings, so that doesn't map fully.
Also, let's not forget that in Apple's case they actually go phone first, then Air+iPad, then Pro, and finally Studio. Personally I feel the lower-end devices should get priority, though; efficiency gains are more valuable in connected devices with limited space for batteries than in my 16-incher with 100Wh.
Of course, it would be nice if we just got the entire range updated at once, but I doubt even Apple could pull off such a supply chain miracle, even if they bought all of TSMC and the entire island to boot...
Aurornis 10 hours ago [-]
> Everyone buying their high end gear is buying something waiting to be refreshed now.
Most of their buyers aren’t buying the highest end parts. Those are a niche market.
Focusing on the smaller parts first makes sense because they’re easier to validate and ship. The larger parts are more complicated and come next.
asdff 2 hours ago [-]
Apple releases a more performant chip every year. If you need it, you buy. It's always outdated around the corner. Why sweat it?
BarakWidawsky 9 hours ago [-]
I’m guessing this is so they optimize processor yields as manufacturing improves
Smaller chips means more of a wafer is usable when a defect exists
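That intuition matches the standard first-order Poisson yield approximation (a textbook model, not anything Apple has published): yield falls exponentially with die area for a given defect density. A sketch with made-up numbers:

```python
import math

# Poisson yield model: fraction of dies with zero defects.
def die_yield(area_cm2, defects_per_cm2=0.1):
    return math.exp(-defects_per_cm2 * area_cm2)

# Hypothetical die sizes: a base chip vs. a Max-class chip ~4x larger.
base, big = 1.5, 6.0  # cm^2, illustrative only
print(f"base: {die_yield(base):.0%}, big: {die_yield(big):.0%}")
# -> base: 86%, big: 55%
```

Even with these toy numbers, quadrupling the die area roughly cuts the fraction of defect-free dies in half, which is why starting a new node with the small chips makes sense.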
tracker1 9 hours ago [-]
+1 on this... it also gives them more opportunity to work out any issues when piecing the larger chips together while catching any post-production issues on the simpler, lower end parts before they hit bigger customers.
rangestransform 9 hours ago [-]
Well here’s the thing, M5 is (probably) a big A19, M1 was a big A14, and so forth. The whole thing of apple silicon is that it’s large phone chips (optimized for performance per watt) rather than small workstation chips (performance at all costs)
GeekyBear 9 hours ago [-]
A Fab ramping up a new process node is hardly a new thing.
The standard practice is to start by producing the chips with the smallest die size.
xattt 10 hours ago [-]
Is this because they know some whales that want the +1 model are going to jump at the opportunity to buy whatever is on the market, and then buy again when higher-range models are released?
leakycap 8 hours ago [-]
Apple knows their sales numbers, so I imagine they know the base model will sell in the greatest quantity. Having it out now means more sales at the highest MSRP before talk of the next model's release.
Buyers who walk into an Apple store for a base MacBook Pro will wait if they hear a new model is coming out. So if you have a buyer basing purchases on the generation number, it makes sense to launch that model as soon as possible.
Pro/Max buyers generally are checking into specs and getting what they need. Hence the M2 Ultra still being for sale, for some niches that have specific requirements.
Someone 9 hours ago [-]
Could be yield related. Do the non top end products run pro chips that didn’t pass testing fully, and have parts disabled?
lotsofpulp 10 hours ago [-]
…or they are still working on fine tuning/testing production of the world’s most technologically difficult thing to mass manufacture?
sroussey 10 hours ago [-]
It takes 3 months to manufacture a chip end to end. If you find a hardware bug once complete, you have to start over and you end up with a three month delay.
Looks like the Pro and Max will be on a three month delay.
astrange 8 hours ago [-]
If you find a hardware bug that late, you're not fixing it and you're looking for chicken bits to flip. It is /extremely/ expensive to validate changes and remake the masks.
reaperducer 9 hours ago [-]
Apple’s chip release schedule is so borked. It should be High end Pro and Studio first and then iPad, Air, Mini and downgraded Pro. Why they release the iPad and Low End Pro is beyond me.
People in the U.S. are starting to think about their Christmas shopping lists right about now.
jrochkind1 8 hours ago [-]
Am I remembering right that the previous 14" MacBook Pro started at $1399 (and seems to be no longer available?), so this is a $200 price increase?
(I had just been looking at macs a few weeks ago, and had noticed how close in price macbook pro and macbook air were for same specs -- was thinking, really no reason not to get pro even if all I really want it for is the built-in HDMI. They are now more price differentiated, if I am remembering right).
throwaway894345 8 hours ago [-]
That would be about a 15% increase, which is probably in the ballpark to be explained by tariffs (either existing or anticipatory)?
wait what. i had no idea. These tariffs are a shitshow.
subarctic 6 hours ago [-]
Are there any markets where they are 15% cheaper due to not having the US's tariffs?
throwaway894345 4 hours ago [-]
Good question, although I don't think they would price them differently, because the Trump administration has openly signaled hostility toward private companies that transparently pass on tariff costs, and Apple has openly signaled subservience to the Trump administration.
asadm 11 hours ago [-]
Wait, I can't have an M5 with 64GB of RAM? The highest is just 32GB, which is ridiculous!
Aurornis 10 hours ago [-]
M5 Pro and M5 Max will come later with higher RAM support.
This has been their staggered release strategy for a while.
SirMaster 8 hours ago [-]
I thought the Pro and Max usually get announced at the same time and the Ultra comes later...
Casteil 5 hours ago [-]
Correct. Sometimes much later.
Still no M4 Ultra Studio available.
bananapub 10 hours ago [-]
they have historically had three tiers of cpus:
- normal
- pro
- max
pro and max had way more cores and gpus and supported way more ram. today's release is the basic version of the new cpu; if you want more ram you can get the m4pro or m4max based MacBook Pros, or wait for the M5pro/max to come out.
Keyframe 10 hours ago [-]
So: MacBook Pro M5 normal, MacBook Pro M5 Pro, MacBook Pro M5 Max. I see the M4 still on offer, which costs more. No WiFi 7 on the M5 (normal or not). 32GB these days, unified or not, in a pro machine... I haven't paid much attention to Macs over the past few years, but I wonder what Steve Jobs would say about this shit.
Analemma_ 11 hours ago [-]
It seems like only the M5 base chip is available at the moment. The M5 Pro and Max, whenever those are released, will presumably have higher limits.
justinator 11 hours ago [-]
and it's unified
Printerisreal 12 hours ago [-]
The lowest tier comes with 16GB of memory, the same memory size as the lowest MacBook Air. Why, Apple?
simonw 11 hours ago [-]
It looks like the highest tier is 32GB, which really surprised me. I guess we'll have to wait for the M5 Pro / M5 Max for more memory than that.
Bad news for anyone who buys the M5 MacBook Pro as an "AI" machine and finds it can't fit any of the more interesting LLMs!
xuki 11 hours ago [-]
It has always been this way. Base M1's max RAM was 16GB, M2/M3's was 24GB, M4's was 32GB.
Printerisreal 11 hours ago [-]
The base M1 was like 4-5 years ago. Did we have that kind of RAM stagnation with the oldest Macs too, 30 years ago? 4-5 years at the same base RAM is incredibly cheap behaviour from Apple. Some phones are literally close to that much RAM now.
masklinn 11 hours ago [-]
Base M1 had 8GB RAM base and 16 max. Where are you getting “same base ram”?
Printerisreal 11 hours ago [-]
100% RAM growth in 5 years is still very slow and cheap.
flemhans 10 hours ago [-]
I think 8GB was also what we had in ... 2012? Or am I wrong? Memory has been moving so slowly.
tracker1 9 hours ago [-]
You could (unsupported) run 16GB of RAM in 2010 MBP models, back before it was soldered on. Worked great, not to mention swapping the spinning drive for an SSD.
At this point, I get the soldered-on RAM, for better or worse... I do wish at least storage were more approachable.
wltr 9 hours ago [-]
First Retina MacBook Pros 13" were with 8 GB base memory. That was either 2013 or even 2012, so, yeah.
polshaw 9 hours ago [-]
Indeed. RAM is the tool of planned obsolescence (and profiteering, for that matter).
astrange 8 hours ago [-]
Get an ad blocker and then get all the people writing Java/Electron apps to fix their memory usage and you'll be good.
Exceptions apply to those running local LLMs.
ortusdux 11 hours ago [-]
They are running out of ways to differentiate their products.
outside1234 11 hours ago [-]
Wait what!?!? My MacBook M1 has 64GB of memory for crying out loud.
masklinn 11 hours ago [-]
M1 maxed out at 16 GB, if you have 64 it’s an M1 Max.
kgwgk 11 hours ago [-]
Wait what!?!? You may have a MacBook Pro M1 Max for crying out loud.
czbond 11 hours ago [-]
The Pro sales page says their RAM is unified, which is more efficient than traditional. Anyone have a sense of how much better unified RAM performs vs traditional RAM?
Their sales copy for reference:
"M-series chips include unified memory, which is more efficient than traditional RAM. This single pool of high-performance memory allows apps to efficiently share data between the CPU, GPU, and Neural Engine.... This means you can do more with unified memory than you could with the same amount of traditional RAM."
tracker1 9 hours ago [-]
The fact that it's unified just means it's shared between CPU/GPU usage, which can be a good thing. A lot of the performance comes from more channels and a more stable distance from the CPU itself... Getting really fast performance from RAM is more difficult with a detachable interface in the middle and longer traces.
Still not the fastest RAM, like what's used on dedicated GPUs, but faster than most x86 options.
masklinn 11 hours ago [-]
RAM has always been unified on every M series CPU.
thefz 10 hours ago [-]
It just means it's shared between GPU and CPU. Has its advantages in specific workloads, but dedicated super-fast GPU RAM is usually better.
Everything else in this statement is marketing bullshit and Apple trying to look like they invented the wheel and discovered fire.
_joel 11 hours ago [-]
It's unified on the Air too though?
hasperdi 5 hours ago [-]
yes, all Mx processors have unified ram
masklinn 11 hours ago [-]
Because the MacBook “pro” with a base (not pro or max) M is and has always been an air with better cooling.
znkr 9 hours ago [-]
Search for memory wall. Moore’s law died a decade ago for DRAM
dvfjsdhgfv 9 hours ago [-]
Well, they could have continued the 8GB joke, so let's appreciate the fact that they finally switched to 16GB base models (and similarly stopped the 128GB SSD madness; those models were outdated when bought).
justonceokay 10 hours ago [-]
For perspective, I have a 12-year-old MacBook with 8 gigs of RAM and it's still perfectly usable for all the things I do on it. If you need more RAM because you are video encoding, compiling, or gaming (why!?), then you aren't a basic consumer.
I'm not trying to be a fanboy, and maybe it's a little bit "cope", but Apple has always put as much RAM as is necessary for the computer to work - and not a lot more - in their base models.
michaelt 9 hours ago [-]
The $1599 M5 Macbook Pro: Good enough for a guy who thinks a 12-year-old MacBook with 8 gigs of ram is "still perfectly usable"
:)
justonceokay 9 hours ago [-]
I know it’s silly, but I think I represent over 90% of apples customers in that way. I just need something that reads emails and shows me porn.
leakycap 8 hours ago [-]
> I know it’s silly, but I think I represent over 90% of apples customers in that way
You're not silly, you're just able to see reality.
Apple knows who is buying the bulk of their computers, and it isn't power users ... most people buying computers don't have a clue what RAM is even used for.
I'd expect to hit beachballs, but macOS balances 8GB of RAM fine for regular users, even with Tahoe
eitally 8 hours ago [-]
That 90% are perfect candidates for the even cheaper Macbook Air lineup.
sixothree 11 hours ago [-]
And a "pro" computer that comes with half a TB of storage by default, with a $200 premium for another 0.5TB. Oof. Just gross.
I know people complain at every release. But I look at the three choices presented and they are all disappointing to me. It's a huge turnoff to see the only initial differentiator presented to be a choice between "measly" amounts of RAM and storage to "barely acceptable" amounts.
To get even close to the specs on my Surface Pro I'd have to hit the configurator and spend at least $1000. Even more to hit the config of my work issued HP notebook.
rvitorper 10 hours ago [-]
Looks premium, eco-responsible, powerful, just like the M4. And the M3. And the one before. Does anyone know what exactly changed this year?
- m4 -> m5, same core number and distribution, "neural accelerators", higher memory bandwidth
- max storage increased from 2 to 4TB (and probably an extra kidney in price)
Everything else is strictly identical.
The marketing blurb claims 2~3x the performance of the M4 for AI stuff (I assume that's the neural accelerators), 20% for CPU tasks (compiling), and 40~80% for GPU tasks (gaming and 3D rendering).
eitally 8 hours ago [-]
This is great but it's apples to oranges until they have Pro & Max variants we can directly compare against the M4 line.
masklinn 8 hours ago [-]
Comparing the models which correspond exactly is apples and oranges? In what wonky alternate reality do you live?
Not to mention the M4 pro and max released 6 months after the M4. If that holds for M5, it won’t be this year.
silverwind 9 hours ago [-]
4TB was available before, maybe only coupled with a Max CPU, which seems not available yet for M5.
masklinn 8 hours ago [-]
I’m comparing comparable configurations. You can get 128GB ram and 8TB storage on an m4 max, and 512GB ram and 16 TB storage on an m3 ultra. Neither is relevant to the M5.
ActorNightly 10 hours ago [-]
The number, which means you should upgrade.
jasonthorsness 9 hours ago [-]
I've been a Windows fan forever, but the new Mac hardware is making it hard to remain one, and it's about time for a new laptop... you can't get a good Windows install on these chips like you could on the Intel-based ones, only virtualized.
rogerrogerr 9 hours ago [-]
Virtualized Windows on M chips is quicker than non-virtualized Windows on your average corporate laptop in my experience.
leakycap 8 hours ago [-]
ARM Windows still has so many pain points, depending on your niche.
jasonthorsness 7 hours ago [-]
Wow maybe it's worth a try
parkersweb 4 hours ago [-]
As an M1 owner seriously tempted by the hardware, seriously put off by Tahoe…
asdff 2 hours ago [-]
If only we could go back to mojave. 32bit support was in fact nice to have.
AlexeyBrin 8 hours ago [-]
At least for now, seems to be available only for the 14" MacBook Pro. I want a 16" M5 MacBook Pro so I will wait ...
gregoriol 8 hours ago [-]
Only the cheapest MacBook Pro gets the M5, while the rest stay with the M4 Pro and M4 Max? What's going on with that lineup?
pram 8 hours ago [-]
Because there is no M5 Pro and Max yet obviously.
Casteil 5 hours ago [-]
Pro/Max rollout tends to lag behind the 'base' by about 6 months
It used to be a little less 'weird' when the base M-chips were only available in the Air and 13" MBP.
brailsafe 7 hours ago [-]
Probably great, but when the hell are they going to do another damn colour. Hoping by the time I upgrade from the M4 Pro they'll have a green version and a cell modem.
Casteil 5 hours ago [-]
I wish they'd bring Space Gray back. Not a huge fan of Silver, and the 'Space Black' apparently tends to show smudges more.
brailsafe 4 hours ago [-]
Meh, I have a Space Black 16" atm, it looks more grey than anything and the smudges are negligible at best. I'm just bored of the lack of variety. The Air is I guess supposed to be the quirky fun computer and the rest are just for "serious" professionals that can only see in grey and silver.
TBH, I have an M2 Pro (personal) and an M4 Pro (work) and I have never been able to tell the difference in day-to-day use. That said, the only really intensive workload I have is photography/videography batch post-processing and I haven't tested that on my work machine. I'm disappointed Apple hasn't published any benchmarks comparing the M5 to the various M4 (or M3) variants.
mjamesaustin 9 hours ago [-]
They've only updated the base M5. Expect the Pro and Max updates to come early next year.
cousin_it 6 hours ago [-]
I read the title as "MS MacBook Pro" and prepared for a surprise.
Ambroos 11 hours ago [-]
The lack of WiFi 7 is disappointing. 6E is fine but by now I'd expect 7 in new computers.
thewebguyd 8 hours ago [-]
Especially considering the M5 iPad pro has WiFi 7.
Sounds like maybe they didn't want to try to fit their new N1 chip this go-around so they could reuse some components? The MacBook still has the same Broadcom chip. Or they're saving it as a differentiating feature for when the M5 Pro/Max comes out later. There's a rumored MBP redesign, so I'm guessing we'll see it then, along with the N1 for WiFi 7.
Casteil 5 hours ago [-]
Good chance they'll introduce it with the upcoming M5 Pro/Max; the non-pro/max devices always tended to be a little lower spec all around.
justincormack 10 hours ago [-]
It is in the new ipad pro as part of the new Apple chipset, so presumably coming to other machines later.
ComputerGuru 8 hours ago [-]
Are these custom boards or just mini pcie network cards you could swap out?
leakycap 8 hours ago [-]
This is Apple: the last time they shipped a pcie/replaceable wifi card was thirteen years ago on the Mid-2012 non-Retina MacBook Pro.
Even pre-Apple Silicon, it's been a decade since users could upgrade MacBook's RAM or internal storage.
lm28469 10 hours ago [-]
homoconsomator needs bigger numbers even if he acknowledges smaller number is ok.
What do you do on WiFi that requires more than 10 Gb per second... on a laptop? You'd fill up the base model's SSD in under a minute of downloading.
Ambroos 9 hours ago [-]
I run our office IT, and WiFi 7 is just better at managing congestion. We have a floor in a busy building and 5GHz is chaos. 6E is fine; it's just strangely old for a company like Apple.
Sad that they released without refreshing the high-end 16” MBP line. I worry they nerfed the 14” MBP to ensure the M4 16” retained better specs, to avoid making the discontinuity worse. Otherwise the 14” outperforming the more expensive 16” would be uncomfortable.
richardubright 11 hours ago [-]
This is only the base M5, not the M5 Pro or Max, and the 16” has never gotten the base chip.
superzamp 11 hours ago [-]
Isn't the screen size difference basically enough between these two? I can't see why the 16" would need more performance; some people just want / can carry large computers with them, while others prefer something as small as possible.
fnordpiglet 10 hours ago [-]
It’s more a statement on price and the assumption that the more expensive one with the “more capable” chip like the Max would not be less performant than any in the lineup. It would be a disappointment, especially for me as I’m about to buy a 16” in November regardless, to be a generation behind while paying more, and it would not be unusual for product reasons to nerf the lower-priced chip to ensure it didn’t cannibalize the more expensive model's sales.
I have to say if I had any choice I would delay my purchase until the 16” catches up rather than buying a generation behind. If I see specs saying M5 14” is more performant for my workloads than my more expensive 16” I’m even more motivated to delay. Most product managers would be aware of these things.
anthonyskipper 10 hours ago [-]
I can see why that sounds sensible, but my personal observations are that heavy-duty power users almost universally prefer the bigger screens, and those people also want the top-spec configurations. Most people I know who want smaller screens are not serious power users.
I can see an overlap with people who want smaller computers who also want max power, but I just would not believe that is a significant group. (again, all personal observations)
platevoltage 7 hours ago [-]
More pixels? That's the only reason I can think of. 13/14 inch is what I tend to go for since I use my laptop as a desktop 80% of the time. 16 is really too big for my needs.
I also think the 15 inch MacBook Air filled the non-power-user-but-likes-big-screen niche.
Aperocky 5 hours ago [-]
I find it annoying that I now want two MacBooks: one tuned for local LLMs and the other tuned for lightness and a big screen (Air 15).
I never really use the local LLMs since I can always go to Claude, but my irrational brain keeps wanting to spend irresponsibly on one anyway.
jiriro 9 hours ago [-]
If I buy in the US and use it in the EU – will the Apple Intelligence work?
I would be happy to sacrifice the EU keyboard and have the AI instead :-)
ComputerGuru 8 hours ago [-]
Unless they’re now doing iPhone-level parts locking, getting the right keyboard would be an eBay purchase away.
surfingdino 4 hours ago [-]
I wish they published CPU and graphic performance comparisons vs M1. Apart from coding I do a lot of large photo (100MP) and video (4K+) file processing and I am interested in core graphics performance. Talking centipedes or flying trains are not my thing so their AI benchmarks are useless to me.
ksturtzkopf 10 hours ago [-]
Why is Apple releasing a MBP with the old gen of pro chips (M4 Pro)?
Aurornis 10 hours ago [-]
The M5 Pro refresh will come later. M4 Pro parts are still available until then.
bitpush 10 hours ago [-]
> The M5 Pro refresh will come later.
Did they announce this or are you speaking for Apple?
Aurornis 10 hours ago [-]
They release the Pro and Max parts after the base part.
This has been their release strategy for past generations.
patates 9 hours ago [-]
Does anyone have a guess on when they could be releasing a potential M4 Ultra Mac Studio?
medvezhenok 9 hours ago [-]
I think they said M4 Ultra Studio is not going to happen, have to wait for M5 Ultra...
Casteil 5 hours ago [-]
...they're not. This is a release of a 14" with the base M5, alongside the other existing M4 Pro/Max models.
The Pro/Max rollout tends to lag behind by about 6 months.
Does it have full-sized cursor keys? PgUp/PgDn/Home/End keys? If not, it's a fashion accessory; it's definitely not a PROductivity machine used by PROfessionals.
Finally at the start of the year I moved back to Linux and couldn't be happier. Had forgotten just how nice it is to have everything I need out of the box. The big thing I miss is Affinity Photo, though that looks like it's in the middle of dying right now.
Sounds more like a you problem, probably due to unfamiliarity. There are endless options for local dev on a Mac, and a huge share of devs using one.
Yes, I know about Yabai and the other things that modify the existing window manager. The problem is the window manager itself.
Outside of the windowing system, running native Linux if you're deploying to Linux beats using an amalgamation of old BSD utils + stuff from Homebrew and hoping it works between platforms, or using VMs. The dev tools that are native to Linux are also nice.
When it comes to multiple monitors, I want a dock on each monitor. I can do that in Plasma, but I can't in macOS, unless I use some weird 3rd party software apparently.
Then you switch to macOS or Windows or even (not your) linux setup and hate it. When I manage to contain myself entirely to the terminal it's okay, but the moment I have to interact with GUI I start to miss those "just right" things.
I can relate. macOS hilariously sucks on certain GUI and terminal aspects. Not much you can do about the GUI; you just have to adapt to the way macOS wants to be used. For the terminal, I use home-manager to manage my $HOME. It's not space efficient and the public caches are sub-par, but it's better than searching "sed in-place replace macos and linux cross-platform" for the 9000th time.
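The sed search mentioned above is the canonical cross-platform trap, so here's a minimal wrapper as a sketch. It assumes only that GNU sed understands `--version` while BSD sed (macOS) does not; the function name `sed_inplace` is my own, not a standard tool:

```shell
# GNU sed (Linux) treats -i's suffix argument as optional; BSD sed
# (macOS) requires one, even if empty. Detect which sed we have via
# --version, which only GNU sed supports.
sed_inplace() {
  if sed --version >/dev/null 2>&1; then
    sed -i "$@"      # GNU sed
  else
    sed -i '' "$@"   # BSD sed
  fi
}
```

Usage is then identical on both platforms: `sed_inplace 's/foo/bar/' file.txt`.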
That just sounds like being accustomed to one way of switching tasks, honestly. If I want previews, I use Exposé (three-finger swipe up/down or ctrl-up/down). But mostly I just use cmd-tab and haven't really needed to see previews there. Because macOS switches between applications, not windows, often there isn't a single window to preview, and I'm not sure showing all the windows would work well either. Exposé works well because it can use the entire screen to show previews.
And then when you full-screen a window, switch to another app for a moment, and then you can’t find it without delving into the ‘window‘ menu.
First world problems. But daily annoyances.
https://discussions.apple.com/thread/254917813?sortBy=rank
This sounds like you think macOS is a good dev environment, but that you personally don't like the UI/UX (always safer to make UI/UX judgements subjective ["I don't like"] rather than objective ["it's bad"], since it's so difficult to evaluate objectively, e.g., compared to saying something like Docker doesn't run natively on macOS, which is just an objective fact).
I can easily develop on both, I prefer developing on Linux. Thus, it is "more good" (IMO), if you prefer.
It was a bit of a struggle to get used to it, coming from windows.
The only thing I really miss now is alt-tab working as expected. (It's a massive pain to move between two windows of the same program)
Or otherwise you can enable the app exposé feature to swipe down with three fingers and it will show you only windows of the same app.
For example, if I open a new Firefox window, the Mac seems to force the two Firefox windows onto different desktops. This already is a struggle, because sometimes I don't want the windows to be on two desktops. I find that if I try to move one window to the same desktop as the other, then Mac will move the other desktop to the original desktop so they are both still on different desktops.
OK, got sidetracked there on a different annoyance, but on top of the above, CMD-backtick doesn't usually work for me, and I attribute it to the windows typically being forced onto different desktops. Some of the constraints for using a Mac are truly a mystery to me, although I'm determined to master it eventually. It shouldn't be this difficult though. For sure, Mac is nowhere near as intuitive as it's made out to be.
To reproduce, get a second monitor, throw your web browser onto that second monitor (not in full screen), and then open an application in full screen on your laptop's screen (I frequently have a terminal there). Then go to a site that gives you a popup for OAuth or a security key (e.g. GitHub, Amazon, Claude; you've got a million options here). Watch as you get a jarring motion on the screen you aren't looking at, have to finish your login, and then move back to where you were.
Everyone tells me how pretty and intuitive they are, yet despite being on one for years I have not become used to them. It is amazing how many dumb and simple little problems arise out of normal behavior like connecting a monitor. Like, what brilliant engineer decided it was a good idea to not allow certain resolutions despite the monitor... being that resolution? Or all the flipping back and forth. It's like they looked at KDE workspaces and said "Let's do that, but make it jarring and not actually have programs stay in their windows". I thought Apple cared about design and aesthetics, but even as a Linux user I find these quite ugly and unintuitive. Or sometimes it just decides to open a link in a new Chrome window instead of just opening a tab... and not even consistently.
If you have an Apple keyboard, CTRL-F3 (without the Fn modifier) will do the same. Not sure if there are third-party keyboards that support Mac media keys, but I'm guessing there are some at least...
GNOME does this much better, as it instead uses Super+<whatever the key above Tab is>. In the US, that remains ` but elsewhere it's so much better than on MacOS.
That's true, hence why I remap it to a "proper" key, above Tab with:
Specifically, sometimes it works with my Safari windows and sometimes it doesn't.
And sometimes when it doesn't work, Option+< will work for some reason.
But sometimes that doesn't work either, and then I just have to swipe and slide or use alt-tab (yes, you can now install a program that gives you proper alt-tab, so I don't have to deal with this, IMO, nonsense; it just feels like the right thing to do when I know I'm just looking for the other Safari window).
I'm not complaining, I knew what I went to when I asked $WORK for a Mac, I have had one before and for me the tradeoff of having a laptop supported by IT and with good battery time is worth it even if the UX is (again IMO) somewhat crazy for a guy who comes from a C64->Win 3.1->Windows 95/98->Linux (all of them and a number of weird desktops) background.
Shame on me.
[1]: https://support.apple.com/guide/mac-help/use-the-keyboard-vi...
https://karabiner-elements.pqrs.org/
https://ke-complex-modifications.pqrs.org/?q=windows#windows...
The biggest problem with Linux is poor interfaces[0], but the biggest problem with Apple is handcuffs. And honestly, I do not find Apple interfaces intuitive. Linux interfaces and structure I get; even if the barrier to entry is a bit higher, there's lots of documentation. Apple, less so. But also with Apple there are things that are needlessly complex, buried under multiple different locations, and inconsistent.
But I said the biggest problem is handcuffs. So let me give a very dumb example. How do you merge identical contacts? Here's the official answer[1]
Well guess what? #2 isn't an option! I believe that option only appears if you have two contacts in the same address book. Otherwise you get the option "Link Selected Cards". That isn't clear, since the card doesn't tell you what account it comes from and clicking "Find duplicates" won't offer this suggestion. There are dozens of issues like this where you can be right that I'm "holding it wrong", but that just means the interface isn't intuitive. You can try this one out: go to your Contacts, select "All Contacts", then click any random one and try to figure out which address book that contact is from. It will not tell you unless you have linked contacts. And that's the idiocy of Apple. Everything works smoothly[2] when you've always been on Apple and only use Apple, but it's painful to even figure out what the problem is when you have one. The docs are horrendous. The options in the menu bar change and inconsistently disappear or gray out, leading to "where the fuck is that button?".
So yeah, maybe a lot of this is due to unfamiliarity, but it's not like they are making it easy. With Apple, it is "Do things the Apple way, or not at all". But with Linux it is "sure, whatever you say ¯\_(ツ)_/¯". If my Android phone is not displaying/silencing calls, people go "weird, have you tried adjusting X settings?" But if my iPhone is not displaying/silencing calls, an Apple person goes "well my watch tells me when someone is calling" and they do not understand how infuriating such an answer is. Yet it is the norm.
I really do want to love Apple. They make beautiful machines. But it is really hard to love something that is constantly punching you in the face. Linux will laugh when you fall on your face, but it doesn't actively try to take a swing or put up roadblocks. There's a big difference.
[0] But there's been a big push the last few years to fix this and things have come a long way. It definitely helps that Microsoft and Apple are deteriorating, so thanks for lowering the bar :)
[1] https://support.apple.com/guide/contacts/merge-contact-cards...
[2] Except it actually doesn't
Well, kinda. You don't have to use all that much Apple software on Macs, though. If you can live with the window manager / desktop environment, then you can use whichever apps you choose for pretty much anything else.
How would you accomplish this? Well actually, I don't know ANYMORE[2]. The linked thread had a solution that worked, but `ipconfig getsummary en0` now redacts the SSID (even when running sudo!). Though `system_profiler SPAirPortDataType` still works, it takes 4 seconds to get the result... so not actually a solution. Yet it shows the idiocy and inconsistency of Apple's tooling. There was a solution, then Apple changed it. wtallis helped me find a different solution, and well... then Apple changed it. YET `system_profiler` still doesn't redact the SSID, so what is going on? Why is it even redacted in the first place? I can just throw my cursor up to the top right of the screen and see the SSID information. If it were a security issue, then I should not be able to view that information in GUI OR CLI, and it would be a big concern if I could see it in some unprivileged programs but not in others.
And that's the problem with Apple. If I write some script to do a job, I don't know if that script is going to work in six months, because some person decided they didn't want that feature. So I find some other command to do the exact same thing and end up playing a game of Whack-a-mole. *It is absolutely infuriating.* This is what I mean by "constantly punching you in the face". The machine fights you, and that's not okay.
[0] I put in quotes because the example I'm about to give is to some "complex" but others "dead simple". I'd actually say the latter is true
[1] https://news.ycombinator.com/item?id=41596818
[2] https://news.ycombinator.com/item?id=41633547
[side note] I've used a similar SSID trick to write myself a "phone home" program in termux for Android and other machines. I can get my GPS coordinates and other information there so you can just write a <50 line program to ping a trusted machine if your device doesn't check in to trusted locations within certain timeframes. Sure, there's FindMy, but does that give me a history? I can't set an easing function to track if my device is on the move. Can I remote into the lost machine? Can I get it to take pictures or audio to help me locate it? Can I force on tailscale or some other means for me to get in without the other person also having technical knowledge? Why not just have a backup method in case one fails? I'm just trying to give this as an example of something that has clear utility and is normally simple to write.
Specifically for this, there's Aerospace (https://github.com/nikitabobko/AeroSpace) which does not require disabling SIP, intentionally by the dev.
For using the vanilla macOS workspaces though: if you avoid full-screen apps (since those go to their own ephemeral workspace that you can't keybind, for some stupid reason) and create a fixed number of workspaces, you can bind keyboard shortcuts to switch to them. I have 5 set up, and use Ctrl+1/2/3/4/5 to switch between them instead of using gestures.
Apart from that, I use Raycast to set keybindings for opening specific applications. You can also bind Apple Shortcuts that you make.
Still not my favorite OS over Linux, but I've managed to make it work because I love the hardware, and outside of $dayjob I do professional photography, and the Adobe suite runs better here than on even my insanely overspecced gaming machine on Windows.
It will be interesting to see how this evolves as local LLMs become mainstream and support for local hardware matures. Perhaps, the energy efficiency of the Apple Neural Engine will widen the moat, or perhaps NPUs like those in Ryzen chips will close the gap.
Here's my repository: https://github.com/lkdm/dotfiles
I use Linux at work and for gaming, and Mac OS for personal stuff. They both build from the same dotfiles repository.
Some things I've learned is:
- Manually set Mac's XDG paths to be equal to your Linux ones. It's much less hassle than using the default system ones.
- Use Homebrew on both Linux and Mac OS for your CLI tools
- Add Mac OS specific $PATH locations /bin, /usr/sbin, /sbin
- Do NOT use Docker Desktop. It's terrible. Use the CLI version, or use the OrbStack GUI application if you must.
- If you use iCloud, make a Zsh alias for the iCloud Drive base directory
- Mac OS ships with outdated bash and git. If you use bash scripts with `#!/usr/bin/env bash`, you should install a newer version of bash with brew, and make sure Homebrew's opt path comes before the system one, so the new bash is prioritised.
I hope this is helpful to you, so feel free to ask me anything about how I set up my dotfiles.
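As a rough sketch of the PATH ordering described in the list above: this assumes Homebrew's default prefixes (`/opt/homebrew` on Apple Silicon, `/home/linuxbrew/.linuxbrew` on Linux); adjust if yours differ.

```shell
# Resolve the Homebrew prefix for the current platform.
if [ -d /opt/homebrew ]; then
  BREW_PREFIX=/opt/homebrew                  # macOS on Apple Silicon
elif [ -d /home/linuxbrew/.linuxbrew ]; then
  BREW_PREFIX=/home/linuxbrew/.linuxbrew     # Linux
fi
# Prepend brew's bin dirs so its newer bash/git shadow the outdated
# system versions shipped with macOS.
if [ -n "${BREW_PREFIX:-}" ]; then
  export PATH="$BREW_PREFIX/bin:$BREW_PREFIX/sbin:$PATH"
fi
# macOS-specific system paths, appended last so they never shadow brew.
if [ "$(uname -s)" = "Darwin" ]; then
  export PATH="$PATH:/usr/sbin:/sbin"
fi
```

With this ordering, `#!/usr/bin/env bash` scripts pick up brew's bash on macOS and the distro's bash on Linux.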
Some of this is probably brew not being as useful as apt, and some more of it is probably me not being as familiar with the Mac stuff, but it's definitely something I noticed when I switched.
The overall (graphical) UI is much more fluid and convenient than Linux though.
I had been a Linux notebook user for many years and have praised it on this board years ago. But today the Linux desktop has regressed into a piece of trash even for basic command line usage while providing zero exclusive apps worth using. It's really sad since it's unforced and brought upon Linux users by overzealous developers alone.
Mac OS trounces Linux in absolutely every way on the desktop; it's not even funny: performance, battery life, apps, usability, innovation. Available PC notebook HW is a laughable value compared to even an entry-level Apple MacBook Air. Anecdata, but I have had no less than five "pro" notebooks (Dell Latitude, XPS, and Lenovo Thinkpad) come and go in the last five years with basic battery problems, mechanical touchpad problems, touchpad driver issues, WLAN driver issues, power management issues, gross design issues, and all kinds of crap, so I'm pretty sure I know what I'm talking about.
The one thing Mac isn't great for is games, and I think SteamOS/Proton/wine comes along nicely and timely as Windows is finally turning to the dark side entirely.
performance - I don't agree
battery life - absolutely
apps - absolutely
usability - I don't agree
innovation - I don't agree
One significant annoyance associated with Linux on a laptop is that configuring suspend-then-hibernate is an arduous task, whereas it just works on a Macbook.
But, the main thing is commercial application support.
Unfortunately I do a lot of C++… I hate the hoops you have to go through to not use the Apple Clang compiler.
My work got me a similar M4 MacBook Pro early this year, and I find the friction high enough that I rarely use it. It is, at best, an annoying SSH over VPN client that runs the endpoint-management tools my IT group wants. Otherwise, it is a paperweight since it adds nothing for me.
The rest of the time, I continue to use Fedora on my last gen Thinkpad P14s (AMD Ryzen 7 PRO 7840U). Or even my 5+ year old Thinkpad T495 (AMD Ryzen 7 PRO 3700U), though I can only use it for scratch stuff since it has a sporadic "fan error" that will prevent boot when it happens.
But, I'm not doing any local work that is really GPU-dependent. If I were, I'd be torn between chasing the latest AMD iGPU that can use large (but lower bandwidth) system RAM versus rekindling my old workstation habit to host a full-size graphics card. It would depend on the details of what I needed to run. I don't really like the NVIDIA driver experience on Linux, but I have worked with it in the past (when I had a current-gen Titan X) and have also done OpenCL on several vendors.
(I have a handful of patches in DynamoRIO.)
Highly recommend doing nix + nix-darwin + home-manager to make this declarative. Easier to futz around with.
Though if you don't like Nixlang it will of course be a chore to learn/etc. It was for me.
Really useful for debugging though
I much prefer a Framework and the repairability aspect. However, if it's going to sound like a jet engine and have half the battery life of a new M-series Mac, then I feel like there's really no option if I want solid battery life and good performance.
Mac has done a great job here. Kudos to you, Mac team!
https://asahilinux.org/docs/platform/feature-support/overvie...
This seems like a very unfair complaint. macOS is not Linux. Its shell environment comes from Darwin, which is derived from BSD. It has no connection to Linux, beyond its UNIX certification.
As a Linux user, I sometimes dream about the Apple hardware, and I tell myself "How hard can it be to get used to MacOS?! It has a shell after all!". The OP reminded me that it can be quite difficult.
I'm often envious of these Macbook announcements, as the battery life on my XPS is poor (~2ish hours) when running Ubuntu. (No idea if it's also bad on Windows - as I haven't run it in years).
Thanks for the heads-up.
MacOS is great for development. Tons of high profile devs, from Python and ML, to JS, Java, Go, Rust and more use it - the very people who headline major projects for those languages.
2ish hours battery life is crazy. It's 8+ hours with the average Macbook.
Did I get a dud? I rarely get over 2.5
Have you checked your Battery Health?
If you have an intel-based Mac, it's the same expected battery life as Windows and 2.5 hours on an intel MacBook battery sounds decent for something 5+ years old.
Gaming is another story though, or any other use that puts a lot of stress on the GPU.
What are the differences though? I have mbpr and a pc with Fedora on it and I barely see any differences aside from sandboxing in my atomic Kinoite setup and different package manager.
People often hate on brew, but as a backend dev I haven't encountered any issues for years.
There isn't a "dev switch" in macOS, so you have to know which setting is getting in your way. Apple doesn't like to EVER show error alerts if at all possible to suppress, so when things in your dev environment fail, you don't know why.
If you're a seasoned dev, you have an idea why and can track it down. If you're learning as you go or new to things, it can be a real problem to figure out if the package/IDE/runtime you're working with is the problem or if macOS Gatekeeper or some other system protection is in the way.
And there are a lot of such things, which are trivial or a non-problem on Linux.
The problem is their philosophy. Somewhere along the way, Apple decided users should be protected from themselves. My laptop now feels like a leased car with the hood welded shut. Forget hardware upgrades, I can’t even speed up animations without disabling SIP. You shouldn’t have to jailbreak your own computer just to make it feel responsive.
Their first-party apps have taken a nosedive too. They’ve stopped being products and started being pipelines, each one a beautifully designed toll booth for a subscription. What used to feel like craftsmanship now feels like conversion-rate optimization.
I’m not anti-Apple. I just miss when their devices felt like instruments, not appliances. When you bought a Mac because it let you create, not because it let Apple curate.
Depends what you mean by window manager, but an app like Magnet does not require disabling security settings.
https://apps.apple.com/us/app/magnet/id441258766?mt=12
Also, note that Thunderbolt is not yet supported[2].
[0] https://web.archive.org/web/20241219125418/https://social.tr... [1] https://github.com/AsahiLinux/linux/issues/262 [2] https://asahilinux.org/docs/platform/feature-support/overvie...
What "permission headaches"?
I know that it's possible to script that since Homebrew handles it automatically, but if you just want to use a specific app outside of Homebrew, experience is definitely worse than on Linux/Windows.
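Most of those headaches come down to the quarantine attribute Gatekeeper attaches to downloaded apps (which Homebrew casks clear automatically). A macOS-only sketch; the app path is a placeholder, and the snippet is guarded so it's a no-op elsewhere:

```shell
# macOS-only: Gatekeeper flags downloaded apps with a quarantine xattr,
# which triggers the "can't be opened" dialogs. Removing it is what
# Homebrew does for you behind the scenes.
app="/Applications/SomeApp.app"   # placeholder path
if [ "$(uname -s)" = "Darwin" ] && [ -e "$app" ]; then
  xattr -d com.apple.quarantine "$app"
fi
```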
Things I prefer: Raycast + its plugins compared to the Linux app-search tooling, battery life, performance. Brew vs the Linux package managers, I don't notice much of a difference.
Things that are basically the same: The dev experience (just a shell and my dotfiles has it essentially the same between OS's)
It may seem like a small thing, but when you have literal decades of muscle memory working against you, it's not that small.
What messes me up when I'm working on a linux machine is not being able to do things like copy/paste text from the terminal with a hotkey combo because there is no CMD-C, and CTRL-C already has a job other than copying.
IMO apple really messed up by putting the FN key in the bottom left corner of the keyboard instead of CTRL. Those keys get swapped on every Mac I buy.
I agree on the Fn key positioning... I hate it in the corner and tend to zoom in when considering laptops for anyone, just in case. I've also had weird arrow keys on the right side of a laptop keyboard where I'd hit the up arrow instead of the right shift a lot in practice... really messed up text-area input.
It's the same thing when switching from a Nintendo to a Western game where the cancel/confirm buttons on the gamepads are swapped.
But in the end the biggest thing to remember is in MacOS a window is not the application. In Windows or in many Linux desktop apps, when you close the last or root window you've exited the application. This isn't true in MacOS, applications can continue running even if they don't currently display any windows. That's why there's the dot at the bottom under the launcher and why you can alt+tab to them still. If you alt+tab to an app without a window the menu bar changes to that app's menu bar.
I remember back to my elementary school computer lab with the teacher reminding me "be sure to actually quit the application in the menu bar before going to the next lesson, do not just close" especially due to the memory limitations at the time.
I've found once I really got that model of how applications really work in MacOS it made a good bit more sense why the behaviors are the way they are.
https://github.com/apple/container
The OS also has weird rough edges when used from the terminal - there are read-only parts, there are restrictions on loading libraries, multiple utilities come with very old versions or BSD versions with different flags than the GNU ones you might be used to coming from Linux, the package manager is pretty terrible. There are things (e.g. installing drivers to be able to connect to ESP32 devices) that require jumping through multiple ridiculous hoops. Some things are flat out impossible. Each new OS update brings new restrictions "for your safety" that are probably good for the average consumer, but annoying for people using the device for development/related.
the workarounds on the internet are like "just build the image so that it uses the same uid you use on your host" which is batshit crazy advice.
I have no idea how people use Docker on other platforms where this doesn't work properly. One of our devs has a Linux host and was unable to use our dev stack, and we couldn't find a workaround. Luckily he's a frontend dev and eventually just gave up using the dev stack in favour of running Requestly to forward frontend from prod to his local tooling.
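For what it's worth, the runtime-flag variant of that workaround on a Linux host usually looks something like the sketch below (the image name and paths are placeholders, and whether it helps at all depends on how the stack's images are built):

```shell
# Hypothetical sketch: run the container process as the host uid:gid so
# files written to a bind mount come out owned by the host user instead
# of root. Wrapped in a function so it can be invoked when needed.
run_as_host_user() {
  docker run --rm \
    --user "$(id -u):$(id -g)" \
    -v "$PWD:/work" -w /work \
    alpine:3 sh -c 'touch out.txt && ls -ln out.txt'
}
# Invoke manually once docker is available:
# run_as_host_user
```

The catch is that `--user` only helps when the image doesn't bake in paths owned by a fixed uid, which is exactly why the "rebuild the image with your uid" advice keeps circulating.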
You use nix or brew (or something like MacPorts).
And they are mighty fine.
You shouldn't be concerned with the built-in utilities.
I've had it make major (with breaking changes) updates to random software when asked to install something unrelated.
Also for dev, set up your desired environment in a native container and then just remote into it with your terminal of choice. (Personally recommend Ghostty with Zellij or Tmux)
I kind of did the opposite. I have a first-gen Framework and really enjoy it, but WOW that thing runs scorchingly hot and loud. Too hot to put on your lap even doing basic workflows. Battery life is also horrible, maybe ~4 hours if you're doing any sort of heavy work, ~6 hours if you're just browsing the web. Did I mention it's loud? The fans spin up and they sound like a jet engine. The speaker on it is also substandard if that matters to you - it's inside the chassis and has no volume or bass.
Last year I replaced it with an M4 Pro Macbook and the difference is night and day. The Macbook stays cool, quiet, and has 10+ hour battery life doing the same sort of work. The trade-off is not being able to use Linux (yes, I know about Asahi, the tradeoffs are not worth it), but I have yet to find anything I could do on Linux that I can't do here.
I also _despise_ the macOS window manager. It's so bad.
https://github.com/lima-vm/lima
I can write up all the details, but it's well covered on a recent linuxmatters.sh and Martin did a good job of explaining what I'm feeling: https://linuxmatters.sh/65/
Since I mostly live in the terminal (ghostty) or am using the web browser I usually don't have to deal with stupid Apple decisions. Though I've found it quite painful to try to do some even basic things when I want to use my Macbook like I'd use a linux machine. Especially since the functionality can change dramatically after an update... I just don't get why they (and other companies) try to hinder power users so much. I understand we're small in numbers, but usually things don't follow flat distributions.
There's often better ways around this. On my machine my OSX config isn't really about specifically OSX but what programs I might be running there[0]. Same goes for linux[1], which you'll see is pretty much just about CUDA and aliasing apt to nala if I'm on a Debian/Ubuntu machine (sometimes I don't get a choice). I think what ends up being more complicated is when a program has a different name under a distro or version[2]. Though that can be sorted out by a little scripting. This definitely isn't the most efficient way to do things but I write like this so that things are easier to organize, turn on/off, or for me to try new things.
What I find more of a pain in the ass is how commands like `find`[3] and `grep` differ. But usually there are ways you can find to get them to work identically across platforms.
But yeah, I don't have a solution to this... :(
[0] https://github.com/stevenwalton/.dotfiles/blob/master/rc_fil...
[1] https://github.com/stevenwalton/.dotfiles/blob/master/rc_fil...
[2] https://github.com/stevenwalton/.dotfiles/blob/master/rc_fil...
[3] https://github.com/stevenwalton/.dotfiles/tree/master/rc_fil...
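To make the BSD vs GNU divergence concrete: one common trick is to probe the userland once in your dotfiles and define portable wrappers. A minimal sketch (`sed_inplace` is a made-up helper name, not a standard command):

```shell
# GNU sed answers --version; BSD sed (macOS) errors out on it.
if sed --version >/dev/null 2>&1; then
  # GNU sed: -i takes no argument
  sed_inplace() { sed -i "$@"; }
else
  # BSD sed: -i requires an explicit (possibly empty) backup suffix
  sed_inplace() { sed -i '' "$@"; }
fi
```

The same probe-and-wrap pattern covers the flag differences in `find` and `grep` as well.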
but I absolutely hate macOS 26, my next laptop won't be a MacBook
It's a shame what they did to this awesome hardware with a crappy update
Linux is too ugly for me to use as my main device. Same with what I’ve seen of Android.
Unless you're talking about the look of the physical machine. Well then that's an easier fix ;)
https://www.reddit.com/r/unixporn/
Sorry you made your first gen chip so good that I don't feel the need to upgrade lol.
Chip, memory and storage are really fast, but I’m fully convinced that the OS is crippling these machines.
Used it for a week and came to the same conclusion, I felt absolutely no difference in day to day usage except that the MBA is nice and slim. And better battery.
M1 MacBooks are ~5 years old at this point, and if you've been working a laptop hard for 5 years it's often worth getting an upgrade for battery life as well as speed.
I haven't booted up the older M1 recently to check, but I remember it was throwing replace-battery warnings well before I got the upgrade, and I think that triggers below 80%.
I use Al Dente to further optimize battery and calibrate it now and then
The lower power and heat of M-devices might result in meaningfully longer battery life, and I'm curious to find out.
Then it started having issues waking up from sleep. Only the OG Apple charger could wake it up, then it would see it actually had 40-60% battery but something had gone wrong while sleeping and it thought it was empty.
Intel MacBooks had terrible SMC issues, so maybe this won't afflict the M-series. Just sharing because I could still use that MacBook a few hours between charges, it just couldn't be trusted to wake up without a charger. That's really inconvenient and got me to upgrade combined with new features.
It'll probably be around $200-$300 if you want an official battery. More like half that if you're willing to accept a 3rd party one.
Even if a local shop somehow sourced a legit, new Apple battery, why wouldn't I go to the Apple Store if it's the same cost and would only be the battery?
(For $299, Apple replaces the speakers, touchpad, batteries, top case, and keyboard and provides a parts and labor warranty for 90 days)
I'm still doing fine with a 16gb M1 Air, I mostly VPN+SSH to my home desktop when I need more oomph anyway. It lasts a full day, all week when you just check email on vacation once a day.
No fan noise, no warmth, unless you are really really pushing it.
in terms of speed, it makes it feel like the original retina did when they first came out. oh and a pretty fast disk as well.
Exactly right. M1 MacBook Pro delighted me in a way that Macs haven't done since my 2013 Retina MBP
Air’s don’t have to be just cheap. I want a thin and light premium laptop for walking around and a second Mac (of any type) for my desk.
I have bought cracked-screen iPhones since Personal Hotspot allowed wired connections back in the 2000s, velcro'd them to the back of my MacBook screen and have been living the "I have internet on my Mac everywhere" life since then. With 5G, I can't really tell when I'm on Wi-Fi vs. when my MacBook opts for the hotspot connection.
I'd love a cellular MacBook and would also insta-buy, but I've given up hope until the next network upgrade.
Apple has over 2.3 billion active devices of which a small percentage are Macs (an estimated 24 million were sold in 2024 and around twice that in iPads).
The most difficult to scale part of a cell network is number of devices connected, not bandwidth used anyway and cellular Macs aren’t going to add significantly more load to a network. And that assumes that Apple even cares what a carrier thinks.
I’m in Australia, not the USA, and for all people like to complain about internet here, we have excellent mobile coverage and it’s relatively affordable, but it’s all priced by usage.
I have 4 devices on my plan with shared 210GB of 4G usage between them for around AUD$200 (USD$130) a month on Australia’s best network (Telstra). I work remotely from cafes a lot (probably around 20-30 hours a week) as a developer and get nowhere close to that usage. I update all my apps, download all my podcasts, listen to lossless music and stream video whenever I want during breaks (although I’m not a huge out-of-home video consumer). I do literally nothing to limit my bandwidth usage and am lucky to use 30-40GB a month across all my devices.
Not a network engineer, but isn't it possible that it's only easy to scale the number of devices because mobile devices play nice with the network? For example, battery life depends on batching network requests, meaning the incentives are aligned between Google, Apple, and the carriers?
If every device defaults to treating the network like a LAN, like MacOS is accustomed to being able to do, that may change the part of the network that's easy to scale
Sheesh, what do you have against MiFi 4G pocket routers?
Yes, I mentioned that in the post you responded to.
> Not sure which apps, if any, respect it, but it's there
It reduces data consumption for me about 1/5. Not nothing, but the Mac can easily consume hundreds of GB of data a week doing "normal" activities. YouTube on a MacBook is many times more data than the equivalent on a phone screen.
My craving has been answered by the GPD WIN MAX 2, a 10" mini laptop with lots of ports and bitchin' performance (AI 9 chip sips battery). It's windows, but an upgrade to pro to disable the annoying stuff via group policy + never signing into a Microsoft account, it's amazing how much faster it is than a machine that's always trying to authenticate to the cloud before it does anything. Wake from sleep is also excellent which was the main thing that kept me using MacBooks. Anyway it's the first computer I've bought in a decade that has some innovation to it.
Edit: there's a slot for a cellular modem but I haven't done enough research to find one that will play nice on US networks
I hope this is the case. I don't know if I would buy a cellular MBP (just wouldn't use it enough) but better tethering is a huge win for me.
The only possible issue I can think of is battery life, but if I'm carrying around my laptop I can throw a charge cable in the bag to keep my phone juiced.
Why not? If I had both with me, I'd rather just have my phone on Airplane mode preserving the battery and my focus.
> Are you ever out and about with your computer but not your phone? I've been happy to hotspot my computers and tablets to my phone, which I always have with me.
I'd really really like to be. The amount of dependence I have on the phone being there at all times is insane. I just want to leave with my laptop and be good to go, no possibility of receiving a call or getting distracted with stupid group chats.
My phone recently died spontaneously, and if I didn't replace it immediately, I can't work online from cafes or anywhere else without depending on the place having open wifi.
We're discussing a MacBook someday with a built-in phone, the closest I've found is an iOS device wired to my MacBook as a wired hotspot. It's like having fast wifi everywhere.
Using my personal phone (that I also use for other things like calls) wouldn't be like having wifi everywhere on my Mac, for example if I walk away from my laptop while on the phone the Mac would lose internet.
The Apple Silicon chips all run in a version of always on these days because the efficiency cores are so, well, efficient.
Additionally, while you may want to burn the battery in multiple devices and deal with having to manage that, I don’t want to.
Apple has been selling cellular iPads since the beginning and I love never having to worry about pairing mine.
Tethering to an iPhone or iPad Is much better than it used to be, but it’s still not perfect.
Apple makes their own modems these days and even with Qualcomm had a capped per device license fee more than covered by the premium they charge for cellular in, say, the iPad.
I know so many people who want this convenience and are willing to pay for it that it just seems like stubbornness at this point that they’re willing to put modems in iPads and not MacBooks.
You sort of have to experience it first hand.
It’s just one of those things that it’s convenient not having to worry about WiFi when we travel and hotel WiFi depending on how busy they are is often pretty bad.
But especially with a laptop, as often as we travel, I don't think I've ever needed to tether to my Mac except for brief periods of time when our condo's shared WiFi went out (I work remotely).
I wouldn’t pay for a separate line for a computer. I am sure others would.
On another note, I did give my mom my previous iPad and kept the data plan so she doesn’t have to worry about WiFi when they take road trips.
But then I don't even care about 5g versus 4g/LTE for the most part, so perhaps I'm just not noticing limits that affect others.
https://en.wikipedia.org/wiki/QoS_Class_Identifier
T-Mobile comes with 5GB of high speed data per month to use for roaming in Canada and Mexico and lower speed data roaming almost anywhere else in the world.
AT&T is "about as bad" as what? You gave no information.
I just tether to my phone. Wouldn’t that work?
edit: suggested retail price also dropped with EUR 100. Mind is less blown now. It seems like a good thing in fact.
edit2: in Belgium, the combined price of the 70W adapter and 2m USB-C to MagSafe is EUR 120.
[1] https://forums.macrumors.com/threads/new-macbook-pro-does-no...
USB-C chargers are everywhere now. Monitors with USB-C or Thunderbolt inputs will charge your laptop, too. I bought a monitor that charges over the USB-C cable and I haven’t use the charger that came with the laptop in years because I have a smaller travel charger that I prefer for trips anyway.
You don’t have to buy the premium Apple charger and cable. There are many cheap options.
I already have a box of powerful USB-C chargers I don’t use. I don’t need yet another one to add to the pile.
Takes like 10 hours and isn't officially supported I think, but it does work.
Nintendo I have no expectations for, but Apple isn't (IMO) that egregiously bad with backwards compatibility
USB-C 15W chargers may be everywhere, but the higher-power charger required for a MacBook Pro is not.
I would have agreed if the device used 10W or 20W, where you could charge it slightly slower. Not for a 70W to 100W MacBook Pro though.
I actually have very few USB-C chargers. With everyone leaving them out of the box, I don’t happen to have a bunch of them by chance. They took them out of the box before giving time for people to acquire them organically. I never bought a single lightning cable, but almost all my USB-C cables had to be purchased. This is not great, considering how confusing the USB-C spec is.
Other than the one that came with my M1 MBP (which I will lose when I sell it), I have had to purchase every charger I have.
Not being able to charge a $1,500+ laptop without buying a separate accessory is crazy to me. I’ve also seen many reports over the years comparing Apple chargers to cheap 3rd party ones where there are significant quality differences, to the point of some of the 3rd party ones being dangerous or damaging. I don’t know why Apple would want to open the door to more of that.
I assume a lot of people will use a phone charger, then call support or leave bad reviews, because the laptop is losing battery while plugged in. Most people don’t know what kind of charger they need for their laptop. My sister just ordered a MacBook Air a couple weeks ago and called me to help order, and one of the questions was about the charger, because there were options for different chargers, which confused her and had her questioning if one even came with it or if she had to pick one of the options. This is a bad user experience. She’s not a totally clueless user either. She’s not a big techie, but in offices she used to work with, she was the most knowledgeable and was who they called when the server had issues. She also opened up and did pretty major surgery on her old MacBook Air after watching a couple YouTube videos. So I’d say at least 50% of people know less than her on this stuff.
Apple positions themselves as the premium product in the market and easy to use for the average user. Not including the basics to charge the internal battery is not premium or easy. I can see it leading to reputational damage.
Had a similar issue with my 2018 MBP Intel - the 86/87 Watt Apple charger was the only thing it would come to life with as the battery aged if the device got too low.
In 2018 I had a phone that entered a boot loop: battery depleted, plug it in, it automatically starts booting, it stops charging while booting, it dies due to depletion, it recognises it’s plugged in and starts charging, boot, stop, die, start, boot, stop, die… I tried every combination of the four or five cables that I had with a car USB power adapter and someone’s power bank, nothing worked. Diverted on my way home (an 8 hour journey) to buy another phone because I needed one the next day. When I got home, I tried two or three power adapters with all the cables and finally found one combination that worked. I learned the lesson that day that allowing devices to deplete completely can be surprisingly hazardous.
The solution is to keep your devices charged. This is feasible if you have a few devices. Not practical for someone like me. I have too many devices. I don't use every device daily.
In my experience a low-power charger will revive it; you just have to wait for it to reach enough state of charge, since it's effectively starting off the battery. This does take a while, but booting while dead off a supply that can't guarantee enough power would be dumb.
Even a Studio Display, which can provide more power than my M1 Pro can use, won't wake it from this state. Apple wants $300 for a replacement battery so I'll just buy a new MacBook at that price, but the charger situation doesn't bode well for M5 MacBook buyers who wonder why their Mac is dead one day (and they just need the exact charger the system wants, but Apple didn't provide it)
Looks like iFixit thinks it's only a "moderate"-difficulty replacement and should only cost you $109
https://www.ifixit.com/Guide/MacBook+Pro+14-Inch+2021+Batter...
I don't want to use a 3rd party battery in a device I carry with me most places I go...
I re-did the battery on my 2013 MBP well after the Apple support period (~2020). I don't think I'd try it on a still-supported Mac unless I was very price sensitive.
On the go, I've bought a small GaN charger with multiple ports. At home, I already have all of my desks wired up with a USB-C charger.
This is especially true for someone moving up to an MBP from an MBA, which takes less juice.
Chargers don’t change quickly. If I lost my charger from 2019, the ideal replacement in 2025 would be literally exactly the same model—and mine still works like new and looks good. I have nothing to gain from buying a new charger.
We should be cheering the EU for ending an abuse that the US has long failed to.
Also, it still bundles a USB-C to MagSafe 3 cable.
If you sell your old laptop when you buy a new one, you generally sell it with old charger. And different Apple laptops take chargers of different maximum watts (they're compatible but not optimal), so they're not all the same anyways.
There's a reason they generally make sense to bundle. Especially with laptop chargers, which provide a whole lot more power than some random little USB-C charger you might have. Sometimes letting the free market decide actually gives customers what they want and find most useful.
Sounds like a symptom of incompatibility. I’ve only ever included the charger when it was specific to the laptop.
> And different Apple laptops take chargers of different maximum watts (they're compatible but not optimal), so they're not all the same anyways.
Chargers automatically provide whatever power level is needed, up to their max, and charging power isn’t the steady tick upward we’re used to elsewhere. The MacBook Pro did get a faster charger a few years ago, relegating old ones to that “compatible but not optimal” state, but meanwhile MacBook Air chargers got slower, and most releases didn’t change the charger. Certainly there are sometimes benefits to buying a new charger, but it happens much less often than new device purchases, and even when there are benefits purchases should still be the customer’s choice.
> Sometimes letting the free market decide actually gives customers what they want and find most useful.
I agree, but “free market” doesn’t mean lawlessness, it means an actual market that’s actually free. Actual market: companies compete on economics, not e.g. violence or leverage over consumers. Actually free: consumers freely choose between available options. Bundling is a very classic example of an attempt to circumvent free market economics, using the greater importance of one choice to dictate a second choice.
Only when there's no competition and you can use that to abuse market power.
But competition for laptops is strong. Most consumers want their laptops to come with a charger, even if you personally don't. That's why they're sold that way.
Like, nobody says the free market is failing because Coke forces me to buy carbonated H2O along with their syrup at the grocery store. The market prefers it when they're bundled.
> Most consumers want their laptops to come with a charger, even if you personally don't. That's why they're sold that way.
Citation needed, on both counts. Plenty of counter-examples in this thread. Non-tech people I know aren’t charger crazed, they’re mildly amused or annoyed by their inexplicable excess of chargers.
> Like, nobody says the free market is failing because Coke forces me to buy carbonated H2O along with their syrup at the grocery store.
I’d say it is indeed failed / nonexistent there, it’s just that nobody cares, because its potential benefit is so small it’s outweighed by overhead. Chargers aren’t laptops or cars or houses, but, as you said, there’s a lot more to them, and they’re more expensive and contribute significantly to e-waste. There actually is a charger market, and it’s better when it’s more free.
To be clear, the healthier market I’m envisioning is one where consumers can make charger purchasing decisions freely, not one where nobody’s allowed to also offer a bundle.
"Charger crazed"? Huh?
They're amused by too many cheap underpowered phone and small device chargers. Not laptop chargers. Those are bigger and you don't usually have any extra.
There isn't much of a "charger market" for laptops, except people who want a second one for a second location. I've never heard of anybody with a Macbook who wanted to buy a non-Apple charger instead. And now Magsafe is back!
Like, my Macbook also bundles a keyboard, a screen, a trackpad, a battery, and so forth. Sure the charger isn't connected with adhesive, but it's still a unified product. You need a charger to use a Macbook, and most people don't have an extra laptop charger with enough power otherwise.
Forcing them to be sold separately for laptops is just silly.
When it was announced, I expected it to be at least 4000 AUD (~2600 USD). When I heard it was starting at 1500 USD instead (~2300 AUD), I was astonished and very excited. And it still is that price… but only in the US. In Australia it is 4000 AUD (the 32GB/1TB model, which is 1700 USD, ~2600 AUD). So I sadly didn’t get one.
Is the rest of the world subsidising the US market, or are they just profiteering in the rest of the world?
Americans pay the same amount, but… stochastically.
PS: Health care is similar. Australians pay a fairly predictable amount via taxes and Medicare, Americans gamble with bankruptcy every time they break a leg. But hey, if they don’t break a leg then the “system works”!
> Under EU rules, if the goods you buy turn out to be faulty or do not look or work as advertised, the seller must repair or replace them at no cost. If this is impossible or the seller cannot do it within a reasonable time and without significant inconvenience to you, you are entitled to a full or partial refund. You always have the right to a minimum 2-year guarantee from the moment you received the goods. However, national rules in your country may give you extra protection.
> The 2-year guarantee period starts as soon as you receive your goods.
> If a defect becomes apparent within 1 year of delivery, you don't have to prove it existed at the time of delivery. It is assumed that it did unless the seller can prove otherwise. In some EU countries, this period of “reversed burden of proof” is 2 years.
As far as I know, the US has zero warranty laws. It can be zero days.
https://www.apple.com/legal/warranty/products/embedded-mac-w...
Apple could subsidize by absorbing part of the tariff in the U.S. and overcharging in the EU.
That said, in the EU we have a two-year warranty.
VAT in the U.S. is no more than 12%.
People in other countries will get pissed but ultimately suck it up and buy a product. People in America will take it as a personal offense due to the current Maoist-style cult of personality, and you'll get death threats and videos of them shooting your products posted onto social media. Just look at what happened to that beer company. No such thing would happen in Germany.
I was told the opposite thing would happen. Sounds like a great deal for us Americans!
Or a certain individual…
I'll take the discount and use one of my 12 existing USB-C chargers.
Compared to the marginal environmental impact to source materials, build hardware and parts, assemble, ship, stock, and transport to customer each unit, the box could be 10x larger and it wouldn't make a dent.
This is not how shipping works.
A larger box, even by 1 inch on any direction, absolutely makes a huge difference when shipping in manufacturing quantities. Let's not pretend physical volume doesn't exist just to make an argument.
10 planes flying with MacBooks == much different than 1 plane (in other words, when you 10x the size of something, as you suggest, it does actually have a huge impact)
A smaller box allows more to be carried. But if we go that route, it's trivial to ship them without any box and box them domestically - and that's a 2-3x volume reduction right there.
Ah yeah I can't imagine any scenario where this could go wrong
Like man in the middle attacks
Replacement/fake products
... or you know, damage? Boxes provide... protection.
> it's trivial
Anytime you catch yourself thinking something is trivial, you're probably trivializing it (aka think about it more and you'll probably be able to think of a dozen more reasons packaging products is the norm)
Prices are about 65 EUR for a 70W (tested DE + CH)
The EU law states they must provide an SKU without an adapter - i.e. they're still allowed to offer one with a power adapter.
Same, for a laptop??? Really? Wild. You can charge these with USB-C chargers too.
Germany: 1758 USD (1512 EUR) without charger.
US: 1599 USD with 70W charger.
This feels like an insult.
I'm still pretty happy with my 16gb M1 Air, but it would be nice to know some closer to real world differences.
I’m confused — they made a comparison that is directly relevant to your situation and you don’t like it?
Most people with an M4 won’t be looking to upgrade to an M5. But for people on an M1 (like you) or even an older Intel chip, this is good information!
Macs barely got faster for ages with Intel - they just got hotter and shorter on battery life.
20% per year is a doubling every 4y. That is awesome.
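The arithmetic behind that claim holds up; a quick sanity check (plain compounding, ignoring that real generational gains vary):

```python
# 20% year-over-year gains compound to a bit over 2x after 4 years.
rate = 1.20
gain_over_4_years = rate ** 4
print(round(gain_over_4_years, 2))  # 2.07
```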
When has 20% been impressive? When Intel to M1 happened, the jump was huge ... not 20%. I can't think of anything with a 20% jump that made waves, even outside of tech.
When I used to do user benchmarking, 20% was often the first data point where users would be able to notice something was faster.
4 minutes vs 5 minutes. That's great! Kind of expected that we'll make SOME progress, so what is the low bar... 10%? Then we should be impressed with 20?
People aren't upgrading from M1, M2, M3 in numbers... so I don't think it's just me that isn't wow'd.
Intel chips were getting faster. It's well documented (and glaringly obvious in the i9 16") that Apple just didn't want to accommodate the full TDP. They tweaked their ACPI tables to run the chips until they hit the junction temp so they were both constantly hot and constantly throttling. Apple tweaked all of their Intel chips in this way, which was a software solution to the Apple-designed hardware simply being unable to cope with the thermal stress.
We know this because the Intel Macbook Pro chassis was only ever used to run Apple Silicon chips that were passively cooled, not Pro/Max variants. The old MBP chassis designs are so awful that Apple doesn't consider them viable for cooling ARM CPUs. I blame Ive, not Intel.
Do you consider margin-of-error, single-digit gains to be worth arguing over? Intel offered 14nm for 4 years straight: Skylake, Kaby Lake, Coffee Lake, Coffee Lake Refresh—four different names, same process node, and 3-7% gains each year. Such fast.
> The old MBP chassis designs are so awful that Apple doesn't consider them viable for cooling ARM CPUs
You don't put a 15-20W chip into a thermal system built for 90W+. The old chassis wasn't "too awful" for Apple Silicon, it was completely unnecessary.
I don't really use local LLMs but think 32GB RAM would be good for me... but I am so ready to upgrade but trying to figure out how much longer we need to wait.
I got the cheapest M1 Pro (the weird binned-due-to-defects one they sold) with 32GB RAM and everything runs awesome.
Always get the most RAM you can in the Mac world. Running a largish local LLM model is slowish, but it does run.
A Mac that's out of memory is just a totally different machine than one with plenty.
Probably because most of the devs building the software are on the highest RAM possible, and there's just so much testing and optimization they don't do.
'Real-world idle' efficiency on the newer chips is the main reason I've got the (slight) itch to upgrade, but 64GB+ MBPs certainly don't come cheap.
From a buyer's perspective, I don't like it at all.
As an other example the current ultra part is the M3, and it was released early 2025, after even the M4 Pro/Max, and a good 18 months after the M3 was unveiled. We might not see an M4 Ultra until 2027.
SSD has double the speed. I guess they say this only for M5 MacBook Pro, because the previous M4 has always had slower SSD speed than M4 Pro at 3.5GB/s. So now the M5 should be at 7GB/s.
I assume no update on SDXC UHS-III.
I suspect the M5 Pro/Max chipped MBPs will bring some of these improvements you're looking for.
we went from 10 hours to 24 hours in 5 years - impressive
I wonder why they advertise gaming on the laptop. Does anyone play anything meaningful on MacBooks?
I'm happy to hear your games work well for you, but it sounds like the games you're playing aren't demanding compared to modern AAA titles. Most games released in the last year or two don't run well on my 2080 test system at anything approaching decent graphics.
Whether or not the M5 GPU is actually capable of that level of performance or whether the drivers will let it reach its potential is of course a completely different story. GPU performance is hard to estimate based on raw specs, you just have to run benchmarks and see what you end up with.
A 5060 outperforms a 2080 by roughly 20% on most titles, across the board, not cherry-picking for the best results. They are not about the same.
> you should be able to run at like 1080p High or better
This is disconnected from reality. 1080p low/medium, some games are playable but not enjoyable. Remember, I actually have a 2080, so I'm not just guessing.
> GPU performance is hard to estimate based on raw specs, you just have to run benchmarks and see what you end up with.
Rich coming from someone who claims a 7-year-old graphics card is "about the same" as a card with 2.5x better ray tracing, 3x faster DLSS, faster VRAM, and much better AI capabilities. The 2080 can't even encode/decode AV1...
Is this a typo? I’m surprised the difference is so small after 3 generations.
That's why I made the specific distinction in the comment you're responding to
When a $599 Windows laptop with a 3060 can play AAA titles and your $1599 MBP can't, I wouldn't normally call that great for gaming.
Most games I've tried just flat out do not work. Plenty will stop you with anti-cheat.
About the only sorts of games that actually work well over WINE on this rig (M3 Pro) are ones that came out 15 years ago.
Native games like No Man's Sky actually got worse over time. When I first got this Mac I was so impressed by the performance of NMS, even though it's an old game at this point; I could run it entirely on Ultra. Then NMS put out an update and that ended: back to medium-low and no AA unless I want to experience pervasive graphical glitches like flashing purple.
Other games have some internal lock on their FPS, even as native Mac games, and I'm not sure why. This is true for Cities: Skylines, which is capped at 40fps. Maybe through Rosetta layer limitations? I'm not sure.
The M5's GPU specs seem to put it near a high-end NVIDIA card from 2018. Impressive as all get out for a power-friendly chip, but not really what I think of when I hear "good for gaming"
The trend I see that is more concerning is that previously Mac-friendly game devs have already abandoned the platform. Valve no longer maintains macOS builds of their games like CS or TF2. Cities: Skylines 1 had a first-party Mac release, but Cities: Skylines 2 skipped macOS.
However, it is not just because of the larger display.
M5 14" starts at:
10-Core CPU
10-Core GPU
16GB Unified Memory
512GB SSD Storage
M5 16" starts at:
14-Core CPU
20-Core GPU
24GB Unified Memory
512GB SSD Storage
So it's the cost of 4 more CPU cores, 10 more GPU cores (double), and +8GB of memory.
Also, I read that the keyboard is slightly different between the Air and the Pro; I'm not a big fan of that chiclet design they released.
Everyone buying their high-end gear now is buying something that's waiting to be refreshed.
AMD is somewhat of an exception/unique case though, having chiplet and monolithic designs depending on the use case, plus console/semi-custom offerings, so that doesn't map fully.
Also, let's not forget that in Apple's case they actually go phone first, then Air + iPad, then Pro, and finally Studio. Personally I feel the lower-end devices should take priority, though: efficiency gains are more valuable in connected devices with limited space for batteries than in my 16-incher with 100Wh.
Of course, it would be nice if the entire range were updated at once, but I doubt even Apple could pull off such a supply-chain miracle, even if they bought all of TSMC and the entire island to boot...
Most of their buyers aren’t buying the highest end parts. Those are a niche market.
Focusing on the smaller parts first makes sense because they’re easier to validate and ship. The larger parts are more complicated and come next.
Smaller chips means more of a wafer is usable when a defect exists
The standard practice is to start by producing the chips with the smallest die size.
Buyers who walk into an Apple store for a base MacBook Pro will wait if they hear a new model is coming out. So if you have a buyer basing purchases on the generation number, it makes sense to launch that model as soon as possible.
Pro/Max buyers are generally checking the specs and getting what they need. Hence the M2 Ultra is still for sale, for some niches with specific requirements.
Looks like the Pro and Max will be on a three month delay.
People in the U.S. are starting to think about their Christmas shopping lists right about now.
(I had just been looking at Macs a few weeks ago and noticed how close in price the MacBook Pro and MacBook Air were for the same specs. I was thinking there was really no reason not to get the Pro, even if all I really wanted it for was the built-in HDMI. They are now more price-differentiated, if I'm remembering right.)
https://www.npr.org/2025/04/12/nx-s1-5363025/apple-iphone-ta...
This has been their staggered release strategy for a while.
Still no M4 Ultra Studio available.
- normal
- pro
- max
Pro and Max had way more cores and GPUs and supported way more RAM. Today's release is the basic version of the new CPU; if you want more RAM you can get the M4 Pro or M4 Max based MacBook Pros, or wait for the M5 Pro/Max to come out.
Bad news for anyone who buys the M5 MacBook Pro as an "AI" machine and finds it can't fit any of the more interesting LLMs!
At this point I get the soldered-on RAM, for better or worse... I do wish at least storage were more approachable.
Exceptions apply to those running local LLMs.
Their sales copy for reference:
"M-series chips include unified memory, which is more efficient than traditional RAM. This single pool of high-performance memory allows apps to efficiently share data between the CPU, GPU, and Neural Engine.... This means you can do more with unified memory than you could with the same amount of traditional RAM."
Still not the fastest RAM, like what they use for dedicated GPUs, but faster than most x86 options.
I'm not trying to be a fanboy, and maybe it's a little bit "cope", but Apple has always put as much RAM as is necessary for the computer to work, and not a lot more, in their base models.
:)
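As a rough back-of-the-envelope for the "can it fit the interesting LLMs" question above: weight memory scales with parameter count times bytes per weight, plus some runtime headroom for the KV cache and buffers. A hypothetical sketch (the 20% overhead factor is an assumption, not a measured figure):

```python
def model_memory_gb(params_billions: float, bits_per_weight: int, overhead: float = 1.2) -> float:
    """Rough estimate of RAM needed to run an LLM locally.

    overhead is an assumed 20% allowance for KV cache, activations,
    and runtime buffers; real usage varies with context length.
    """
    bytes_per_weight = bits_per_weight / 8
    return params_billions * 1e9 * bytes_per_weight * overhead / 1e9

# A 70B model at 4-bit quantization needs ~42GB: too big for 32GB of unified memory.
print(round(model_memory_gb(70, 4), 1))
# An 8B model at 4-bit needs ~4.8GB: fits comfortably even on a 16GB machine.
print(round(model_memory_gb(8, 4), 1))
```

This is why the base 16GB/24GB configurations only comfortably run the smaller quantized models, whatever the unified-memory marketing says.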
You're not silly, you're just able to see reality.
Apple knows who is buying the bulk of their computers, and it isn't power users ... most people buying computers don't have a clue what RAM is even used for.
I'd hit beachballs, but macOS balances 8GB of RAM fine for regular users, even with Tahoe.
I know people complain at every release. But I look at the three choices presented and they are all disappointing to me. It's a huge turnoff to see the only initial differentiator presented to be a choice between "measly" amounts of RAM and storage to "barely acceptable" amounts.
To get even close to the specs on my Surface Pro I'd have to hit the configurator and spend at least $1000. Even more to hit the config of my work issued HP notebook.
- m4 -> m5: same core count and distribution, "neural accelerators", higher memory bandwidth
- max storage increased from 2 to 4TB (and probably an extra kidney in price)
Everything else is strictly identical.
The marketing blurb claims 2-3x the performance of the M4 for AI stuff (I assume that's the neural accelerators), 20% for CPU tasks (compiling), and 40-80% for GPU tasks (gaming and 3D rendering).
Not to mention the M4 pro and max released 6 months after the M4. If that holds for M5, it won’t be this year.
It used to be a little less 'weird' when the base M-chips were only available in the Air and 13" MBP.
Sounds like maybe they didn't want to try to fit their new N1 chip this go-around so they could reuse some components? The MacBook Pro still has the same Broadcom chip. Or they're saving it as a Pro-differentiating feature for when the M5 Pro/Max comes out later. There's a rumored MBP redesign, so I'm guessing we'll see it then, along with the N1 for Wi-Fi 7.
Even pre-Apple Silicon, it's been a decade since users could upgrade MacBook's RAM or internal storage.
What do you do on Wi-Fi that requires more than 10Gb per second... on a laptop? You'd fill up the base model's SSD in under a minute of downloading.
Apple M5 Chip
https://news.ycombinator.com/item?id=45591799
I have to say if I had any choice I would delay my purchase until the 16” catches up rather than buying a generation behind. If I see specs saying M5 14” is more performant for my workloads than my more expensive 16” I’m even more motivated to delay. Most product managers would be aware of these things.
I can see an overlap with people who want smaller computers who also want max power, but I just would not believe that is a significant group. (again, all personal observations)
I also think the 15 inch MacBook Air filled the non-power-user-but-likes-big-screen niche.
I never really use local LLMs since I can always go to Claude, but my irrational brain is always tempted to spend irresponsibly on that.
I would be happy to sacrifice the EU keyboard and have the AI instead :-)
Did they announce this or are you speaking for Apple?
This has been their release strategy for past generations.
The Pro/Max rollout tends to lag behind by about 6 months.