What an end to an era. It's crazy to think she started this journey at 18 and finished 5 years later. Not many people believed they would be able to make the GPU work in Asahi Linux. Kinda curious what her "Onto the next challenge!" link means. Is she working on Intel Xe-HPG next?
kccqzy 3 days ago [-]
Yes, I think so. Her resume says she started working for Intel on open source graphics drivers this month.
chao- 2 days ago [-]
Wish her the best with this. Intel staying competitive in GPUs can only benefit the consumer. Those who want a mid-tier graphics card without paying to compete with AI use cases may not be a huge group, but we do exist! Those who use desktop Linux may be a small group within that small group, but we do exist!
xiphias2 3 days ago [-]
Too bad it was not Apple who hired her for the M4, but in business, the leaders are always the most closed ones.
homebrewer 2 days ago [-]
Thank Jesus it's Intel and not Apple. Intel has been extremely good at working upstream and has immense contributions in the Linux kernel, Mesa, and elsewhere. Wasting such talent on Apple would make the world worse for us all.
qingcharles 2 days ago [-]
Intel seem to be pretty good in the .NET ecosystem too, pushing a bunch of performance upgrades.
MangoToupe 2 days ago [-]
I don't see a future for Intel, frankly, but I'm very happy she found a good-paying job.
tlamponi 2 days ago [-]
There was a time when people said that about AMD.
Don't get me wrong, Intel's outlook is IMO currently indeed rather bleak, but I would not completely write it off just yet.
CharlesW 2 days ago [-]
> There was a time when people said that about AMD.
And Apple, to complete the circle.
MangoToupe 2 days ago [-]
AMD is equally fucked. Building off of IP-locked architectures is just a graveyard. Even Apple will hit a wall one day.
Incipient 2 days ago [-]
There are a myriad of companies that have thrived in "IP locked" environments, a host that have failed too. Equally there are heaps that have thrived and failed in "IP open" environments.
I think at best you could say it's more challenging or perhaps risky being a bit restricted with IP, but I'd call it miles away from a "graveyard".
You can hardly say Intel/AMD/Qualcomm etc. are all struggling because their architectures are locked down.
Look at PowerPC/Power ISA. It's (entirely?) open and hasn't really done any better than x86.
Fundamentally you're going to be tied to backwards compatibility to some extent. You're limited to evolution, not revolution. And I don't think x86 has failed to evolve? (e.g. AVX10 is very new)
koakuma-chan 2 days ago [-]
Why? Apple makes good products, it seems, unlike Intel.
kccqzy 2 days ago [-]
Apple doesn't have any contributions to the Linux kernel or other parts of the Linux graphics stack. They are unlikely to hire someone who wants to work on open source.
jajuuka 2 days ago [-]
To be fair they don't have anything to do with Linux so there is nothing to contribute back towards. They use BSD licensed software for a reason.
Apple does have open source projects. https://opensource.apple.com But the scope is rather limited. For someone of Alyssa's skillset there really isn't anything there.
MangoToupe 2 days ago [-]
Sure, but Linux isn't necessary. Legacy software can be virtualized.
dylan604 2 days ago [-]
Apple hiring her would essentially prevent her from doing it again on other models, keeping the moat intact.
jandrese 2 days ago [-]
Apple doesn’t contribute back to the community.
mckenzba 2 days ago [-]
From what I recall, Apple forbids its employees from participating in open source work it doesn't approve of. And given Apple's culture of secrecy, its agenda of maintaining a walled garden with their products, and her work basically contradicting the two, I doubt her being hired by Apple would benefit anyone other than Apple.
ta988 3 days ago [-]
Apple is too much about being closed and creating barriers; not sure that would have been a good fit. Plus, it's a good way to flee a quickly degrading country.
porphyra 2 days ago [-]
Honestly if Apple had embraced Linux, the Apple Silicon CPUs would have been amazing for all sorts of server, scientific, and AI/LLM work. Too bad they are clamping down on the walled garden to focus on consumer toys instead.
benoau 2 days ago [-]
The real shame is the longevity: the M1 Pro and M1 Max were discontinued two and a half years ago, so they're on their way to the vintage list and could be entirely obsoleted by the end of this decade! Linux support is the only thing that will keep these machines usable after that.
zozbot234 2 days ago [-]
> Honestly if Apple had embraced Linux, the Apple Silicon CPUs would have been amazing for all sorts of server, scientific, and AI/LLM work.
You can already do this work on M1/M2 using Asahi. A compute server doesn't need fully working peripherals and external displays.
porphyra 2 days ago [-]
I do have an M2 MacBook running Asahi, which works amazingly well for my casual use, but I think there is no way anyone will use last-last-gen hardware on a volunteer-developed OS for any actual work, server use cases, and so on.
solardev 2 days ago [-]
How would GPGPU work in such a scenario?
zozbot234 2 days ago [-]
The M1/M2 GPU is supported via Vulkan Compute. (Or OpenCL/SYCL, going through rusticl.)
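As a sketch of what that looks like in practice (the environment variable follows Mesa's rusticl convention, and the "asahi" driver name is an assumption to verify against your Mesa version):

```shell
# Mesa's rusticl OpenCL frontend is opt-in: Gallium drivers are enabled by name.
# "asahi" is assumed to be the driver name used by the Asahi project.
export RUSTICL_ENABLE=asahi

# With the variable set, standard tooling should enumerate the GPU
# (both commands require the actual hardware and drivers installed):
clinfo                  # lists OpenCL platforms/devices
vulkaninfo --summary    # shows the Vulkan driver used for Vulkan Compute
```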
solardev 2 days ago [-]
Thank you!
MangoToupe 2 days ago [-]
Really a matter of perspective, tbh. Linux is also quite toy-like in its mess.
monocasa 3 days ago [-]
Good luck to her. That's one of the pieces of Intel I think will survive its slow motion implosion.
pjmlp 2 days ago [-]
Hardly. Phoronix has a few reports of Linux driver folks being laid off at Intel.
monocasa 1 day ago [-]
Everywhere in Intel will be subject to layoffs.
My point was that the graphics division itself will still be around, as integrated mobile SoCs are basically the only revenue stream Intel still has a good handle on. That requires a graphics core, and all of the other usable options are either not for sale to Intel, have burned Intel in the past, or are owned by Arm.
porphyra 2 days ago [-]
I just hope that Intel doesn't squander the talent like they did with Jim Keller.
brookst 2 days ago [-]
Intel’s core competence is squandering talent by having finance managers and outside consultants make technology business decisions. Something happened to their culture a few decades ago and they forgot that revenue is a trailing indicator of good decisions and you can’t just decide you want to make a lot of money and trust the product strategy to materialize from that.
lotsofpulp 2 days ago [-]
Intel still has all the same short sighted bosses, the board of directors at Intel hasn’t changed.
frabonacci 2 days ago [-]
From "draw a triangle" to upstream Vulkan on M1. Practically, this makes the Venus/virtio path viable for guests on Apple Silicon (no passthrough in VZ), which is what many people actually need.
marsven_422 2 days ago [-]
[dead]
jasoneckert 2 days ago [-]
This is an incredible achievement... not just for the technical depth, but for what it represents. Alyssa's work is nothing short of inspiring. The way she combined deep technical insight with years of dedication has not only brought open-source graphics to Apple Silicon, but also lit a fire under reverse engineers and open-source developers.
She has shown a whole new generation that curiosity and persistence can break barriers. I thoroughly enjoyed watching the developments these past several years. Massive respect to her and everyone who made this possible, and kudos on her new position at Intel.
zgwiuuurt 2 days ago [-]
[flagged]
xyzsparetimexyz 2 days ago [-]
What?
berbec 2 days ago [-]
Just random internet bigotry - Alyssa Anne Rosenzweig is transgender. [1]
The evidence of my eyes is two anonymous troll accounts that were just created for this thread. HN mods should really do something about it, it's way too easy to sign up and ban evade.
nemomarx 1 day ago [-]
When someone tells you they prefer going by a nickname, do you also complain that they're forcing you to ignore the evidence on their paperwork and refuse?
judge123 3 days ago [-]
The author basically speedran modern graphics APIs on 'impossible' hardware and then just... walks away. Total mic drop.
adrian_b 2 days ago [-]
Switching to work on Intel GPUs is not walking away.
It is accepting a new challenge.
adastra22 2 days ago [-]
It is walking away from her user base. (I’m not complaining — I’m an open source dev too and recognize I have no right to put demands on her time. But what is the future for Asahi Linux after this? I don’t see one.)
3 days ago [-]
anon-3988 3 days ago [-]
1. student at UofT
2. a lead at Collabora
3. a very successful and ambitious hobby project
How tf does she juggle and manage to do all this? I can barely do one of the above properly.
kubb 3 days ago [-]
One of the few people who are actually competent.
Although most likely she’s well compensated, and doesn’t have to waste time on useless efforts at work, this level of discipline and striving towards a goal is just very rare in general.
Possibly also no family, limited social life and no other hobbies.
jonathanlydall 3 days ago [-]
For myself, when I lived on a different continent to my family, had limited social life and job with strictly set hours, it was much easier to have the time needed to make significant progress on a hobby.
However, discipline is an enormous factor too, actually using that extra available time on something “productive” is no easy feat.
Now I have kids and live in the same area as my parents and siblings again, entirely happy, but less free time.
Cthulhu_ 2 days ago [-]
One of the unspoken benefits of being young, you're unlikely to have grown into a management position and can focus on not-management stuff.
kubb 2 days ago [-]
This is what managers tell themselves to feel better about their idleness, but in the end it’s just another excuse.
Every person is different of course, there might be this one brilliant engineer forced to manage against his will somewhere.
actionfromafar 2 days ago [-]
Forced by financial concerns, a decent bunch I'd say.
kubb 2 days ago [-]
Fair, fair. I’d take a salary bump, because it affords an illusion of being able to escape the Cage faster.
ddddang 2 days ago [-]
[dead]
tmp20250827 3 days ago [-]
2021 and 2022 were also when many places were only just coming out of COVID lockdowns. I remember how much dead time I had back then. I used it to watch lots of series and YouTube videos. I wish I had the discipline and motivation to work like she did during that era with all that free time.
abustamam 2 days ago [-]
Yeah same. I made a dent in my gaming backlog and TV shows and built a gaming computer that has only ever been used to play Factorio (notably, a game that can probably be played without a GPU).
Half of me kinda wants another lockdown so I can do more discipline-y stuff but the other half is like, dude you're just gonna waste it playing more games. I just gotta face the music - I'm just not disciplined and I just don't have the drive.
jandrese 2 days ago [-]
She does a bunch of social media stuff with her girlfriend on top of that, so “limited social life and no other hobbies” may not be a good description.
MegaDeKay 2 days ago [-]
Many people are competent. She's exceptional.
xyzsparetimexyz 2 days ago [-]
Ignorant comment. I'll happily be incompetent if it means I have those things
e40 2 days ago [-]
Sounds like one of those mythical 10x engineers!
comonoid 2 days ago [-]
Perhaps, "Deep Work" by Cal Newport can explain that.
ornornor 3 days ago [-]
Pretty cool. She’s achieved more at 23 than I have after over a decade in the industry. What a talented engineer.
iwontberude 3 days ago [-]
No clue who you are, but real talk she’s achieved more than I will in my entire life. I’ve been in the industry for decades.
sheepscreek 3 days ago [-]
She just started working at Intel in August and has already accomplished more than most would in a year[1]. Incredible!
Just to say a big thank-you to the Asahi team, and especially for the GPU work. It is still on my list to get back to some OpenGL dev work, especially since I recently made Fedora Asahi Remix my daily driver, and I have to say it is amazing. It feels like I once again own my computer.
Their work has inspired me to continue bashing away at my Zig PinePhone code, although I'll never have the skills to get its GPU running anything beyond a poke'd framebuffer.
That checklist of supported APIs in Asahi is mind-blowing, especially in such a short timeframe. Again, well done, thank you, and best of luck at Intel.
allenrb 3 days ago [-]
Not much to say beyond a hearty "well done, you!" That, and looking forward to seeing what's next.
sangeeth96 3 days ago [-]
Inspiring stuff! I didn't even expect basic Linux support on M1 to be so good in such a short time-span, leaving graphics aside. I was very pleased when I tried booting up Asahi on M1 a couple months back and went on to get work done in it and even enjoy some games.
Thanks for all your amazing contributions Alyssa and all the best for the road ahead!
szidev 2 days ago [-]
Alyssa is such an inspiring individual. I'm glad she's working on the things that interest her.
jacquesm 2 days ago [-]
What a project. Of all of the IT work that I'm aware of I have a hard time choosing between this and Fabrice Bellard's output, both are - for me at least - equally impressive.
tiffanyh 3 days ago [-]
Kind of amazing Alyssa didn’t end up working at Apple (instead of Intel).
ninjin 3 days ago [-]
They seem closely aligned with the Free Software Foundation (FSF), so I could very well imagine that being a major ideological reason not to want to work with Apple. Yes, Apple sometimes upstream patches and they do contribute to open source here and there, but they certainly are no FSF poster child. Intel on the other hand are about as open as it gets when it comes to their track record in the graphics space. I personally have nothing but admiration for Rosenzweig's work and I hope they will continue to find environments where they can flourish and do great things in the years to come.
ndiddy 2 days ago [-]
Alyssa's post mentions how lots of the work she's done has at least started as side projects while she's working on something else (Panfrost while at high school, M1 drivers while at Collabora). Obviously I'm not her so I can't say anything specific to her. In general, Apple doesn't allow its developers to work on open source projects on the side while employed at the company. I think this is a stupid idea that costs them a lot of talent, but I doubt Apple cares what I think. I've seen multiple cases where an active open source contributor gets hired by Apple, then their presence in open source communities vanishes. Based on all the open source work she's done so far, I think it would take a lot to make her stop all contributions like that.
sroussey 3 days ago [-]
Maybe she didn’t pass a leetcode interview. :p
technofiend 3 days ago [-]
You do have to wonder how that kind of interview would go. Hopefully it would be actual engineers that created what she reverse engineered instead of some gatekeeper trying to one up her somehow.
wmf 3 days ago [-]
Maybe she doesn't want to.
GeekyBear 3 days ago [-]
You set an ambitious goal and executed beautifully despite a very busy schedule.
Well done.
blu3h4t 3 days ago [-]
May I ask something: I want an Apple silicon MacBook Air and will probably just be running Linux on it. What are the pros and cons of getting an M1 vs an M2, apart from more RAM and so on?
Thx
Perz1val 2 days ago [-]
The short answer is that it's just a stupid idea (and a waste of money). Asahi only works somewhat ok on M1.
Jnr 2 days ago [-]
Agreed, it is not that stable/usable.
I tested it on M1 Pro and was hopeful, but after some years I realized it is not viable for daily use. Many things still don't work and I doubt that they will any time soon.
Last year I was given M4 Pro at work and it is not supported at all.
Looking at the drama and people stepping down, I don't think MacBooks will be properly supported on Linux in this decade.
flkiwi 2 days ago [-]
On the other hand, I have an M2 Air and it's stable, fast, and I haven't thrown anything at it that it doesn't handle perfectly. But the fingerprint reader doesn't work.
(The M3/M4 are in progress but not supported. That's public on the project's compatibility chart.)
Jnr 2 days ago [-]
The biggest deal breaker for me was no support for external displays (through DP alt mode/thunderbolt).
Also, the infrequent random OS crashes were annoying. And sometimes WiFi would stop working after sleep (would not show any access points) and would require a reboot.
The M1 is 5 years old already and is still not fully stable and lacks features. It seems like the overall development effort started slowing down a couple of years ago, and while we did get the amazing audio daemon and graphics driver, development of other things seems to be stuck.
If I remember correctly, there were also some comments from Marcan (?) on social media about issues with supporting newer chips (M3/M4), hinting that M3 and M4 are vastly different and require significant effort to add Linux support.
So if the M3, M4 and other future versions are too different to get supported in a decent time frame, then Asahi is all about supporting years-old hardware. That reduces interest from Linux users looking to buy a laptop now, potentially shrinking donations, the developer pool, interest, etc.
I love what Marcan, Alyssa, James and others have achieved and how they have pushed Linux further. I think that their contributions will stay relevant and be useful for other hardware for many years to come.
blu3h4t 2 days ago [-]
Sorry, but I just can't let it go: I bought a Microsoft Dev Kit 2023 just to test whether the hundreds of gigabytes of Windows software I'm responsible for would deploy on it with System Center :D
Whether that software would also work, of course, is another thing. :D
brabel 2 days ago [-]
Are you coming from Windows? macOS is a BSD descendant, so it's quite Unix-y. I never miss Linux on it, and I used to only use Linux. Just learn how to get around the minor annoyances (e.g. the file explorer sucks; I use Emacs for that) and it's a fine OS. It's really not worthwhile trying to install anything else on the Mac.
pjmlp 2 days ago [-]
macOS as UNIX is pretty fine for anyone that is happy with UNIX, and isn't looking for yet another Linux distribution.
For anyone who treats whatever Linux distros do as the definition of UNIX, though, there are enough surprises in there.
SSLy 2 days ago [-]
The M2 has MagSafe.
dreamcompiler 2 days ago [-]
M2 Air has magsafe. M1 Air does not.
SSLy 2 days ago [-]
yes, that's spelt out in the context above.
brookst 2 days ago [-]
My M1 Pro has MagSafe.
MBCook 2 days ago [-]
GGP was asking about the air though.
brookst 2 days ago [-]
Ah, my mistake.
4gotunameagain 2 days ago [-]
What is this, quora ?
Tiberium 3 days ago [-]
Sorry to hijack, but since the topic is related: is the development of Asahi Linux still actively ongoing, or has slowed down a lot? The progress for M1 and M2 was steady and now almost everything is done, but the M3+ work still seems to not have started. And with major contributors leaving the project I'm kind of worried for the future of Asahi (on newer Apple hardware).
GeekyBear 3 days ago [-]
The new leadership team set a short term goal of getting their existing work upstreamed, which seems to be going well.
> Our priority is kernel upstreaming. Our downstream Linux tree contains over 1000 patches required for Apple Silicon that are not yet in upstream Linux. The upstream kernel moves fast, requiring us to constantly rebase our changes on top of upstream while battling merge conflicts and regressions. Janne, Neal, and marcan have rebased our tree for years, but it is laborious with so many patches. Before adding more, we need to reduce our patch stack to remain sustainable long-term.
> With Linux 6.16, we also hit a pretty cool milestone. In our first progress report, we mentioned that we were carrying over 1200 patches downstream. After doing a little housekeeping on our branch and upstreaming what we have so far, that number is now below 1000 for the first time in many years, meaning we have managed to upstream a little over 20% of our entire patch set in just under five months. If we discount the DCP and GPU/Rust patches from both figures, that proportion jumps to just under half!
> While we still have quite a way to go, this progress has already made rebases significantly less hassle and given us some room to breathe.
So if the discussions are true, it can take years for the developers to finish M1/M2 upstreaming, with all the Linux kernel bureaucracy. That is, unless they decide to start working on M3 before finishing the upstreaming.
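The rebase treadmill described in the quoted report can be sketched with plain git (the repository layout, file names, and commit messages here are made up for illustration):

```shell
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -qb main repo
cd repo
git config user.email dev@example.com
git config user.name dev

# Upstream history.
echo v1 > core.c
git add core.c
git commit -qm "upstream: v1"

# Downstream tree: out-of-tree patches stacked on top of upstream.
git checkout -qb downstream
echo driver > apple-gpu.c
git add apple-gpu.c
git commit -qm "downstream: apple gpu driver"

# Upstream moves on every release cycle...
git checkout -q main
echo v2 > core.c
git commit -qam "upstream: v2"

# ...so the whole downstream patch stack must be replayed on top of it,
# resolving any conflicts each time. With 1000+ patches this is laborious;
# every patch that lands upstream shrinks the stack.
git checkout -q downstream
git rebase -q main
git log --oneline
```

Every patch the team upstreams is one fewer commit to replay (and potentially conflict) on each new kernel release, which is why shrinking the stack from ~1200 to under 1000 matters so much.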
zozbot234 3 days ago [-]
Makes sense, every patch they upstream is less maintenance and forward-porting work that they have to do. Keeping a downstream kernel up to date is very painful, even one that's "near mainline" as with Asahi's.
laweijfmvo 3 days ago [-]
i hope some day a used M1/M2 macbook air will be the greatest linux laptop around
rc00 3 days ago [-]
I would hope not. That would mean that no other vendor has shipped working ARM hardware support for Linux or has upstream support in the kernel. Forget the hostile nature Apple has proven to possess when consumers dare treat their hardware as if paying for it makes it their own.
Qualcomm has been beating the marketing drum on this instead of delivering. Ampere has delivered excellent hardware but does not seem interested in the desktop segment. The "greatest Linux laptop around" cannot be some unmaintained relic from a hostile hardware company.
finaard 2 days ago [-]
As somebody who has worked at a company that did Qualcomm devices in the past: Qualcomm just cares about money grabbing, and is not any less hostile to developers than Apple.
If you want to build a device and your only chip option is Qualcomm, I'd recommend not building a device at all.
zozbot234 2 days ago [-]
FLOSS stacks for Qualcomm-based devices are actually a lot more feature complete than some other brands like MediaTek or Exynos. Still nowhere near any kind of "daily driver" status but at least getting somewhere, whilst others have yet to even get started.
lostlogin 2 days ago [-]
> I would hope not. That would mean that no other vendor has shipped working ARM hardware support for Linux or has upstream support in the kernel.
Can you see any other machine coming close to a Mac in terms of hardware quality and performance? Obviously the cost is silly, but while I agree with your sentiment, it seems optimistic to hope.
jacquesm 2 days ago [-]
Networking is going to be another major issue. Even on the Intel MacBook Pro this is still a problem. The instructions for getting it to work are so bizarre that I ended up with a network dongle with a supported chipset instead.
neobrain 2 days ago [-]
Good news for you: Networking (ethernet/wifi/bluetooth) on M1/M2 have been working perfectly fine for a while and don't require any special tinkering.
jacquesm 2 days ago [-]
Oh that is good news. I'm almost tempted to try that out.
neobrain 2 days ago [-]
I can recommend it! I've been daily driving M1 for a few months now, it's working really well. Parent poster is raving about a potential "greatest linux laptop", but depending on your use case it's already there.
IME the Asahi support page is spot-on: There are a couple of yet-unsupported features (DP-alt mode being a big one), but any feature listed as supported will just work without hidden gotchas. I find this a big contrast to other devices, which will often "work" but have annoying little quirks here and there that are workable but can feel like a downgrade compared to Windows.
jacquesm 2 days ago [-]
How is battery life?
neobrain 2 days ago [-]
On macOS, I never worried about battery life when leaving the house even when doing compute-heavy work. On Asahi, that is equally much true. I couldn't tell you how many hours it lasts because I never have to carry a charger unless I'm out for more than a regular workday.
There's some room for improvement, but that is purely relative to macOS. Asahi still solidly beats other x86 devices (other than the low end ones you wouldn't do development work on).
One issue is that idle battery consumption is higher than on macOS (an active area of improvement though [1]), which you'll notice by an M1 laptop discharging about 12% overnight where macOS would've eaten maybe 2-3%. Not a big issue normally, but it can be inconvenient if the device shuts down overnight due to an empty battery.
During more passive use in the daytime (e.g. playing music), the display tends to be the biggest power hog. That's not really Linux-specific, but it's why I actively turn off the screen when it's not needed (KDE lets you configure the power button to do so).
Ok. Will definitely look into this, thank you for all the time you put into the reply.
Keyframe 3 days ago [-]
I'd easily pay, let's say, $100-200 a year to have Linux running on modern Apple laptops with full features. I'm sure I'm not alone. Their hardware, "our" OS would be perfect. Well, except the notch and lack of OLED, but reportedly that's in the works too.
The MacBook Pro uses mini-LED; in terms of contrast it's quite good, much better than IPS.
The MacBook Pro display is one of the best laptop displays.
swiftcoder 2 days ago [-]
And somehow it has way less blooming than other mini-led displays I have used. Not clear how they pull that feat off exactly
zozbot234 2 days ago [-]
Most likely, they have more mini-leds and/or more ability to independently control them. Of course the localized "blooming" of mini-leds is a lot easier on the eyes regardless than the all-around bloom of a backlit display.
(Better for the battery too, if you can keep most of the screen dark.)
ozgrakkurt 2 days ago [-]
It is not even close. Pretty much any OLED is so much better
zozbot234 3 days ago [-]
The M3+ GPU is also very different. So while it may be true that the driver development for M1/M2 is now more or less complete as OP says, future work along the same lines will very much be needed.
hanikesn 2 days ago [-]
>The M3+ GPU is also very different.
Any sources for that? I'd be quite surprised if Apple had radically altered the architecture.
ykl 2 days ago [-]
This is a pretty well known thing; the M3/A17 generation GPU was a ground-up redesign that added things like dynamic caching and hardware ray tracing [1] which are highly nontrivial to simply extend an existing architecture to support. Unfortunately I can’t find where I read this, but IIRC at the time M2 came out there were expectations that M2 would have a new GPU architecture with hardware ray tracing but this wound up being delayed to M3 because it took longer than expected to do a ground-up redesign of the GPU.
>For years I juggled my courses with my part-time job and my hobby driver
Man, I wish I had half of the energy of this author.
dang 3 days ago [-]
> I wish i had half of the energy of this guy
Trolling will get you banned here, so please don't.
mrheosuper 3 days ago [-]
I swear I'm not trolling; I had no idea the author is a woman.
dang 3 days ago [-]
Sorry for misinterpreting you! All we can do is pattern-match, and sometimes the pattern doesn't match.
wltr 2 days ago [-]
So, you are not going to unflag, undead the comment you killed, right? The one you are sorry about, claiming you misinterpreted them. Just curious, not trolling.
dang 2 days ago [-]
Ah good point. Normally I do that, but I forgot to do it here. Fixed now.
bloqs 3 days ago [-]
Not your fault; she is trans, so it's a sensitive subject. She generally hides references to it, but her picture is on her site, linked by others.
That means … what? I censor whatever I like (dislike), I guess.
dang 2 days ago [-]
"Moderation is guesswork" means that we can't always know what someone's post means, or was intended to mean, nor can we know for sure what effects it will have. We can only make guesses (i.e. interpretations), and those are inevitably wrong sometimes.
We can't not make mistakes. The best we can do is acknowledge when we make a mistake and do what we can to fix it.
> I censor whatever I like (dislike), I guess
If you knew how many comments I dislike on HN, you would no longer have that perception.
(Why) are the M3 and M4 really that different from M1 and M2?
dagmx 2 days ago [-]
They’re significantly different GPU architectures. They added support for hardware features like mesh shading, raytracing and better shader occupancy/dynamic caching.
Beyond that, each M series generation also brings more of the system into the SoC. For example, the entire storage controller is part of the SoC in the M1, but the M2 brought in the trackpad controller as well.
Bringing more functionality into the SoC has many advantages but it does make it more difficult to target because you can’t just make use of existing off the shelf controller knowledge to apply to it.
Fokamul 2 days ago [-]
I'm curious: why aren't these people hit with a C&D from Apple,
while other great projects, like Corellium (an actual iOS VM, not that crap Apple makes), get hit hard with lawsuits etc.?
(You know, a great project for the people still reverse engineering iOS for 0-days and reporting them to Apple. That's long behind me; reporting 0-days for peanuts, yeah right :) )
MBCook 2 days ago [-]
The Asahi team has made comments that Apple is clearly 'silently encouraging' what they're doing.
With all of Apple's secure boot stuff, they had more than enough ways to totally squash running alternate OSes on these machines like a bug.
Instead they seem to have gone out of their way in a few places to make it not only possible but secure.
They’ll NEVER say anything publicly, or give documentation, but they’re leaving doors open on purpose.
creesch 2 days ago [-]
If I had to guess: one seeks to reverse engineer hardware to run an open source OS, the other seeks to emulate a platform to run a proprietary closed source OS.
If I remember correctly, Apple at the introduction of the M1 made some explicit statements about the hardware not being locked down, something along the lines of nothing preventing Linux from running on it.
Perz1val 2 days ago [-]
I remember Apple even made some change that made the Asahi devs really happy; they said "this is for us" (marcan's tweet IIRC).
dagmx 2 days ago [-]
Completely different projects in terms of what they’re providing.
Corellium was selling and distributing access to Apple's software along with security bypasses.
Asahi is not redistributing any Apple IP, uses Apple-sanctioned methods to run, and is not commercial.
2 days ago [-]
jajuuka 2 days ago [-]
The biggest thing is that the Asahi project is not a for-profit venture. It also mutually benefits Apple, since it encourages hardware sales to non-traditional customers.
saagarjha 2 days ago [-]
Corellium being a commercial product (and thus a company you can sue) probably made it easier.
mrcwinn 3 days ago [-]
Huge respect for her. I consider myself a talented software engineer. I can’t get anywhere close to this accomplishment.
abnercoimbre 2 days ago [-]
Genuine question: is she a once-in-a-generation prodigy? We forget this class of people indeed exists. As fellow professionals are we inspired or deeply ashamed of ourselves?
rowanG077 3 days ago [-]
Honestly kind of heartbreaking to see her leave Asahi Linux. She has done insane work building the Vulkan driver from scratch. I wish her well working at Intel. If I ever buy an Intel GPU I can rest much easier knowing it will work well on Linux. If she is working on the Linux driver stack, that is.
finaard 3 days ago [-]
There isn't really anything left to do for her - everything missing (including work on the newer graphics chips) can be somewhat easily done by less talented people, building on her work.
She did the challenging stuff she cares about. One aspect of nerd brain often is that you can hyperfocus on challenging stuff, but can't get the motivation to work on stuff you don't care about - and even what would be a 20 minute task can end up taking days because of that. It's great that she has the self awareness to set goals, and step away once they're done.
I didn't have that at that age - and still sometimes struggle. I was lucky enough that my employer back then recognized my issues and paired other people with me for the stuff I was not interested in, and I now usually manage to offload those tasks to co-workers myself.
zozbot234 2 days ago [-]
This is of course great as long as you can find enough "challenging" work to perform, but any successful project is going to involve a whole lot of seemingly "boring" work. A big part of true maturity and professionalism is being able to find the interesting challenge even in these more run-of-the-mill tasks and successfully engage with them.
(Mind you, I'm not talking about a matter of inborn temperament or character, much less a moral flaw! Rather, finding the compelling challenge even in "boring" tasks is a valuable skill and situational tactic that anyone should explicitly learn about and aim to acquire as part of becoming a mature professional, not a matter of morality or somehow being dismissed as "lazy"!)
rowanG077 3 days ago [-]
M3, M4 and soon the M5 are ready to be cracked open :). From what I understand they are actually somewhat different hardware-wise, so it's really not like there would not be a continuation of this work. But of course it's natural to want something else after years of working on the project.
finaard 2 days ago [-]
Take into account that she's focusing on the 3D stack, not the overall hardware. Even with hardware differences there's a good chance it's not different enough to make it an interesting new challenge.
rowanG077 2 days ago [-]
Yes, I meant the GPU specifically, not the hardware in general. An example is the support for hardware ray tracing in M3 and beyond. In a now-deleted French fediverse post, Alyssa indicated M3 has a new architecture.
zozbot234 2 days ago [-]
Given the features that have been advertised for the M3+ graphics and compute stack, there's rather a good chance that it is different enough to create big, new challenges for third-party support.
r_lee 2 days ago [-]
Sounds like ADD to me. Easy to be labeled as "Lazy" etc.
Fokamul 2 days ago [-]
ADD is an old term -> ADHD. Nvm.
I wouldn't be so quick to judge someone for ADHD.
Because I have it, untreated. And I couldn't even finish university because of it.
I'm unable to do certain things, like at all, I'm nearly physically ill when doing these things. Hard to explain it, to someone without these problems :)
Luckily enough, it's not that important here / Idc about money, career etc.
jondwillis 2 days ago [-]
Yeah, no way there’s any evolutionary fitness to having a brain that only works on problems it finds worthwhile. /s
mschuster91 2 days ago [-]
Ideally, society would be aware of such people and actually use their potential. AR struck it lucky and so did a few others (cough Richard Stallman cough), but most don't and end up burned out by rigid megacorp structures and processes that don't respect that people, even those one might call "neurotypical", aren't cogs in a machine.
I've said it before and I will keep saying it again: the financialization of everything and the utter dominance of braindead, long-since disproven MBA ideology is going to seriously impede our societies in the next decades.
contrarian1234 3 days ago [-]
I never understood this project. Maybe I'm missing something, but the timescale is such that by the time they're done the product isn't even being sold anymore
At least with Panfrost it made more sense bc it still being used
M1 chip laptops can only be bought second hand at this point
adrum 3 days ago [-]
I believe Walmart has a deal[1] with Apple to sell[2] M1 MacBook Airs. This has been the case for a year or so, so I don't think it's old stock. They have been in stock since that date, and are slowly getting cheaper.
Oh interesting! For some reason I thought Apple devices could only ever be purchased from Apple (maybe that was only during the Steve Jobs era)
But 8GB of RAM.. that's unfortunately completely unusable by most developers. (Panfrost drivers you can at least use on RPi-like devices)
Maybe in another 5 years it'll work on the M3/4 and I'll revisit this. Good to know the devices are still being built so long after release
MBCook 2 days ago [-]
They’ve always been available at Best Buy, BH Photo, and other authorized partners in the US.
The Walmart deal is a total mystery. It started, seemingly, as dumping new old stock without selling it on Apple.com, but they've even updated the machine, I think, so clearly it's an ongoing concern.
Nothing like it I know of for Apple, ever. I’d love to know the story.
lotsofpulp 2 days ago [-]
I would guess Apple’s goal is increasing the number of people buying recurring Apple services, such as icloud and tv+.
It has been a long time since people have needed cutting edge laptops, so an M1 bought today will still work for 90% of people for the next 5+ years. Even if Apple doesn’t earn a large profit margin on the sale of the laptop, they could earn a decent amount on monthly services revenue, plus increased odds of that person buying a watch/airpods/phone/etc.
MBCook 2 days ago [-]
I would agree except the machines are so low spec I don’t think they’re a good experience.
An M1 is great. But RAM and storage won’t hold up as long.
I suspect they can sell them at that price and still make a killing, and all the equipment to make chassis/etc is already paid off.
lotsofpulp 2 days ago [-]
For what purposes will the RAM and storage not hold up?
I see most people around me watching media, using a web browser to shop, maps, look at photos/videos (small storage is great for Apple, then more people buy icloud), fill out pdfs, and maybe some email or light excel.
Presumably, those are the people likely to buy a laptop at Walmart.
wpm 2 days ago [-]
Storage can at least be expanded externally, if at a cost of speed, reliability, and convenience.
For the RAM, 8GB is not enough, but in fairness, when the system can page out at 200GB/s, paging out doesn't hurt nearly as badly. It's only when things have to thrash the page file that it becomes readily apparent on these machines (say, when an application needs to keep more than a few GB of stuff resident in memory all the time).
MBCook 2 days ago [-]
I agree. If you’re using a lot of cloud services (Google Docs, iCloud Drive, Spotify, etc) you can get by without a lot of storage these days.
But even if 8 GB of RAM holds you today, will it hold you five years from now?
Or are you going to have to get rid of the computer much sooner and buy another one by then?
Whereas simply doubling the RAM would likely extend the life a significant amount.
solarkraft 2 days ago [-]
8 Gigs of RAM still give you a great experience if you don’t do everything at once like I do.
It’s not high spec for sure, but with M1 RAM counting double (they swap very efficiently up to a certain point) it’s still plenty for casual use.
aloha2436 3 days ago [-]
Why would the product have to be available new for the project to be worth it? There are still many M1 chips out there, and this helps prolong the usefulness of those chips.
ac29 3 days ago [-]
> M1 chip laptops can only be bought second hand at this point
New M1 Macbook Airs are still available at Walmart (maybe elsewhere). But even if not, who cares? People are still writing code for computers that haven't been sold since the 1980s.
contrarian1234 10 hours ago [-]
Yeah, I get it if it's just a for-fun kind of thing. I didn't mean to knock it. I guess generally people want their work to end up being used, and the timeline is such that that seems semi-impossible.
As the other replies show, you can still buy this machine, but it sounds like it likely won't be for too much longer
The developers involved must be acutely aware of it. Maybe they have some sense that the work will easily update to current M chips. Or maybe they don't really care about that. It was just an interesting exercise and they move on
ivolimmen 3 days ago [-]
Hum.. I can still buy a 100% hardware compatible NEW C64...
norman784 2 days ago [-]
You can repurpose decent hardware, in terms of performance and power consumption, to run for far longer than Apple is willing to support it.
syabro 2 days ago [-]
Why only M1? Just installed stuff on my M2 Max.
Some texts say M3 is also supported...
dezgeg 2 days ago [-]
When the next gen chip comes out you can usually reuse large amount of stuff.
contrarian1234 2 days ago [-]
"usually" being compared to what..?
Maybe it's just due to a complete lack of attention, but I think M3/4 support is extremely minimal at this point. Which is not a great sign..
ActorNightly 3 days ago [-]
I'm glad she stepped away from Asahi Linux. It's absolutely great from a technical perspective, and so is the progress that team has made, but talented people like her shouldn't be trying to reverse engineer software/hardware from a shitty anti-consumer company that could make the entire project work in a heartbeat by publishing documentation, instead of building better stuff from the ground up.
beagle3 2 days ago [-]
Reverse engineering requires a different mindset and a somewhat different skill set than "forward" engineering. I've met people who were happy to do only reverse engineering (to figure out what makes things "tick") without building anything new.
If it were up to me, 2 years of successful reverse engineering (on a variety of projects/products) would be a requirement to be called an engineer. You learn a lot from working things that you can't learn from a book (and without having to make the mistakes yourself first…)
Just to make it clear: I am not implying anything about Alyssa - just stating an observation based on my own experience.
allenrb 2 days ago [-]
If I could make roughly the same comp, I would jump on an all-RE job without a parachute. Sounds like heaven.
ActorNightly 2 days ago [-]
I mean, she wasn't just reverse engineering, she was doing it to make graphics work on Asahi Linux.
ronsor 3 days ago [-]
> in lieu of building better stuff from the ground up
To be fair, even if you have the best CPU and GPU designers, it's not as if you can call up TSMC and have them do a run of your shiny new processor on their latest (or even older) process. You can't fab them at home either.
overfeed 3 days ago [-]
Fortunately for her, Intel - her new employer has "fabs at home". Though on older nodes, TBF.
distances 3 days ago [-]
Intel's GPUs are manufactured on TSMC though.
overfeed 2 days ago [-]
That is correct. In my previous comment, for the sake of brevity, I deleted what I had written about Intel also being able to "call TSMC and have them do a run of its latest design", but it felt like that would have been belaboring the point that Intel isn't a rinky-dink chip operation, despite losing its commanding lead.
amiga386 1 days ago [-]
Ultimately we didn't need Jon Lech Johansen's work (and Derek Fawcus's, and others') on cracking DVD DRM or cracking Apple's FairPlay DRM, as there have always been alternatives. But their efforts did push the fight against DRM in our favour, and who knows what the world would be like if we had done nothing?
Creating things is a gamble, as mass adoption is almost never by technical merits, but by marketing. So you could make open documented everything but still end up with nobody benefiting from that openness, because a competitor (whether open or not) wipes you out. You saw this happen even in the era where electronic devices were expected to come with full schematics -- there were winners and losers even then.
But, if something has become widespread and well adopted, and it's not open, that's a problem. It absolutely should be opened up and documented. Especially if it's not because the money-grubbing creators of the something are deliberately hiding how it works and locking down control in order to extract more money from everyone else's pockets. The sooner you put an end to that, and the more often you fight against that, the sooner society itself becomes more efficient and fairer for everyone.
kmeisthax 3 days ago [-]
Even with proper documentation, there still would have been loads of work to get M1/M2 GPUs working on Asahi Linux. Writing GPU drivers worth a damn is about as difficult as targeting a compiler to a new CPU architecture. It would not be "in a heartbeat".
ActorNightly 2 days ago [-]
I'm not talking about chip-level documentation. Apple could take their driver source code and compile the kernel-level parts for Linux (since it's all just C code) while open sourcing the user space.
jimmydoe 3 days ago [-]
Lucky you, Intel.
yfhuli 2 days ago [-]
Amazing work! The Panfrost driver was very impressive to me before, and now I learn she also solved the GPU driver problem on the Mac. Not sure Intel will be a good career path for her. :)
giancarlostoro 2 days ago [-]
Interesting that it cuts off at the M2, I guess the M3 was a massive enough shift that it did not transfer over as nicely. I certainly hope that the M3s and M4's are supported by the time Apple EOL's my M4 laptop so I can still slap Linux on it. The M4 series of Macbooks is one of the nicest laptops I've ever had the pleasure of owning and using.
manav 2 days ago [-]
Very cool, and what an amazing way to learn so much beyond what you do in those years at university (at least in the engineering sense). I wish we had software and hardware open enough to run custom setups, whether it's NVIDIA on x64 or Ampere/arm64 with macOS.
londons_explore 2 days ago [-]
The Apple GPU stuff is amazing, but I wish the rest of linux-on-apple work did better.
For example, neither suspend nor hibernate works on Apple hardware, which means if you put the laptop away for a day or two the battery will die and you'll lose everything you were doing.
brcmthrowaway 3 days ago [-]
Does Xe HPG compete with NVIDIA?
wmf 3 days ago [-]
Xe HPG is also known as A750 and B580 so yes, it competes with the 3060/4060/5060.
Quizzical4230 3 days ago [-]
What a legend in the making!
miguelxpn 2 days ago [-]
I can't wait to see what she accomplishes at Intel. She has a very bright future ahead of her!
avbanks 2 days ago [-]
Congrats! A great achievement.
phkahler 2 days ago [-]
I wonder if the Intel drivers are going be written in Rust.
pbkompasz 1 days ago [-]
She did a great job
tiahura 3 days ago [-]
Congrats on a job well done.
spamjavalin 2 days ago [-]
What a legend.
anubhav200 3 days ago [-]
Great work
BuildTheRobots 2 days ago [-]
> I’m on the board overseeing Linux graphics. Half of us are trans. If all you care about is Linux, resist the attacks on trans people.
> If you have any decency, fight back.
By the way, out of pure curiosity, why does it seem that there is a disproportionately high number of super talented trans programmers? I mean, trans people make up a relatively small percentage of the general populace, so it would be unlikely for half of any random group, even if it's only 7 people, to be trans. There are even memes like programmer socks, etc. I also personally know several very proficient programmers who are trans.
stouset 2 days ago [-]
I am nowhere near an expert or even a particularly well-informed person here, but my idle speculation is that online technical spaces are a place where you can have much more freedom in comfortably choosing the outward persona you wish to present. In meatspace, if your physical appearances and your chosen persona don't match up in ways that some people find disagreeable, many people will go out of their way to let you know it.
Edit: I'm proposing a selection effect here, where people with these challenges gravitate towards spaces and communities where they don't feel confronted by them as often.
kevincox 2 days ago [-]
But by this logic roughly half of the population is trans, but the vast majority are afraid to present as such. The actual number of trans people is surely impossible to know due to these societal pressures but I find it hard to believe that almost half of the population is trans. I suspect there are other factors.
itsmek 2 days ago [-]
My read is that the post is not arguing what you think (that it's caused by freedom to present how they feel and that it's a representative population) but instead that it's caused by a selection effect. But this argument is implied, so I see why you mistook it.
stouset 2 days ago [-]
That's correct. I suspect there's a selection effect where people who struggle with these challenges in their lives find comfort in a space where they can present as the person they want to be without constantly being questioned, attacked, or otherwise made uncomfortable. And that's not because this industry is more enlightened, but because it's much easier to contribute and participate only with an online persona that can be divorced from your physical characteristics.
I hope it's clear that I am not trying to speak authoritatively and that this is more or less a total guess on my part based on some pretty superficial analysis.
BuildTheRobots 2 days ago [-]
Post hoc ergo propter hoc. I think your logic is backwards. Parent isn't saying the freedom of online technical spaces turns people trans, but that trans people in those spaces are more likely to make themselves obvious because of the freedoms.
There's also a high overlap between trans and non-neurotypical people. I'd suggest there's the same crossover between non-neurotypical people and tech circles, so it makes sense that there's a higher incidence of trans people in tech.
I'd also suggest that on some level coming out as trans is basically hacker mentality. "This hardware/wetware doesn't look/perform correctly. Let me get a soldering iron/rewrite some code/clothing/hormones/surgery and change that."
majorchord 2 days ago [-]
> disproportionately high amount of super talented trans programmers
Could it be that they also happen to be autistic? There appears to be a pattern I've noticed where a large number of trans developers are also autistic. This may be responsible for a lot of the passion and dedication required to work on projects like this. (Mostly video game) Emulation is another field that seems to have a similar overlap. Some might say Rust is another one.
abustamam 2 days ago [-]
I'm unclear what the relevance of this comment is. It's by the same author but doesn't seem to bear any relevance to the M1 GPU.
Refreeze5224 2 days ago [-]
The author is incredibly talented and has done the community a valuable service; one that the HN crowd especially is likely to benefit from.
Many people in her situation are not lucky enough to be in a mental or physical position to be able to pursue this sort of work or take advantage of their talent, and that is in large part due to the persistent and long-standing discrimination trans people face.
Even if one is totally self-interested, it pays not to discriminate. Even if one can't muster up even a little decency to not discriminate because, ya know, it's wrong.
abustamam 2 days ago [-]
I agree. But I wouldn't have known that the author was trans had the commenter not pointed it out. My thoughts about the original article are unchanged, but if the goal is to not discriminate, then pointing out a feature that many people would discriminate against seems counterintuitive imo.
If the original commenter said something like "it's inspiring that a trans person could overcome all odds and have such an accomplishment" then that comment has value. Randomly pointing out that the author is trans with zero context has no positive value.
altairprime 2 days ago [-]
Their quote was easily translated for me:
“Asahi Linux wouldn’t exist without trans people. No one realizes how much Linux progress, and especially how much Linux graphics/gpu progress, hinges on trans people. Perhaps the selfish motivation of wanting Linux to succeed will make HN readers less placidly tolerant of trans hate in the future.”
But, as one of those “wait, you’re trans?!” people who silently contributes to projects without labeling myself, I get how it could be confusing. Hope that helps!
abustamam 2 days ago [-]
Thanks! I understood the quote in itself, but the OP quoting it with no additional context could have had two connotations: one nefarious (don't bother with this article because it was written by a trans person with an agenda) or one of positivity (Linux wouldn't be where it is now without trans people).
I always hope for the latter, but context and mood are impossible to communicate easily online which is why I was asking for clarification.
altairprime 2 days ago [-]
There’s a lot of value in selecting the best faith interpretation and then replying in that context; for those silent readers who might be confused, it helps clear up their confusion to the disfavor of any underhandedness — which is then forced to either identify itself plainly (and promptly get flagged) or lose the battle and move on.
abustamam 3 hours ago [-]
That's good advice.
fourside 2 days ago [-]
Not the OP but my guess is that they want to raise awareness that people who are making significant technical contributions in our field are affected by the actions of the US administration. Not that exceptionally talented people deserve more legal protections, but the reality is that our industry would be worse off without people like Alyssa.
I know it can be at times grating to constantly hear about stuff like this, but I can assure you folks like Alyssa would rather focus on techie stuff rather than have to ask for help so they can stay safe. For them that worry is now always present and it’s not something they can ignore.
abustamam 2 days ago [-]
Thanks. I agree with your sentiment. But the OP just posted a random quote from one of her blog posts with no context. It could have either been to raise awareness about her being trans (honestly I'd have never known had it not been for the comment), or to discount the original article because she is trans.
I hope it was the former, but that's why I didn't want to just assume the OPs intention.
klelatti 2 days ago [-]
> I’m 21 now. I’ll be blunt: if not for gender-affirming care, I don’t know if I would be around. If there would be FOSS graphics drivers for Mali-T860 or the Apple M1.
The board is 7 people. I'm unclear what makes this awkward or inconvenient.
alt187 2 days ago [-]
The proportion of trans people in humanity is less than 50%. Pointing this out makes it seem the board has a disproportionate number of trans people.
abustamam 2 days ago [-]
And HN has a disproportionate number of tech enthusiasts... We can point out random disproportionate number of X in any Y population; it doesn't make it relevant to GPUs or Asahi Linux
nemomarx 2 days ago [-]
I mean it probably does? A somewhat disproportionate number of people in software are trans.
However, you could look at it the other way - if you take any group of 5-10 people I bet they're disproportionately something. Very few groups are selected evenly from all of humanity.
stouset 2 days ago [-]
Okay, and?
Refreeze5224 2 days ago [-]
Are you not aware of the very casual prejudice you're showing with this comment, or do you not see a problem with it?
Regardless of your political or religious views on people's right to exist, simple politeness if nothing else should prevent this sort of comment. I assume you wouldn't say it to their face, so why say it here?
I would not have known she was trans if it weren't for random commenters randomly saying "she's trans and just wants to blend in"
I agree with that statement, but consistently calling it out kinda seems to be having the opposite effect.
nemomarx 2 days ago [-]
What's the negative stereotype we're worried about here? "trans women are hackers" or something? Linux secretly makes you trans?
The banker thing would be bad optics because of conspiracies about it, but I'm not aware of an equivalent.
BuildTheRobots 2 days ago [-]
"It would have been the year of the Linux desktop too, if it wasn't for those pesky trans hackers!"
stouset 2 days ago [-]
How magnanimous and thoughtful of you to try and make it seem like your original comment was about protecting trans people who all just want to blend in and not be seen.
Except, you know, these specific people are openly trans and explicitly choosing for themselves to make a point about it.
reader9274 3 days ago [-]
[flagged]
lysp 2 days ago [-]
Have a read of other posts and see what you might want to edit in yours.
I think at best you could say it's more challenging or perhaps risky being a bit restricted with IP, but I'd call it miles away from a "graveyard".
You can hardly call Intel/amd/qualcomm etc all struggling due to the architectures being locked down.
Look at PowerPC / the Power ISA. It's (entirely?) open and hasn't really done any better than x86.
Fundamentally you're going to be tied to backwards compatibility to some extent. You're limited to evolution, not revolution. And I don't think x86 has failed to evolve (e.g. AVX10 is very new).
Apple does have open source projects. https://opensource.apple.com But the scope is rather limited. For someone of Alyssa's skillset there really isn't anything there.
You can already do this work on M1/M2 using Asahi. A compute server doesn't need fully working peripherals and external displays.
My point was that the graphics division itself will still be around, as integrated mobile SoCs are basically the only revenue stream Intel still has a good handle on. That requires a graphics core, and all of the other usable options are either not for sale to Intel, have burned Intel in the past, or are owned by Arm.
She has shown a whole new generation that curiosity and persistence can break barriers. I thoroughly enjoyed watching the developments these past several years. Massive respect to her and everyone who made this possible, and kudos on her new position at Intel.
1: https://web.archive.org/web/20250520182445/https://rosenzwei...
It is accepting a new challenge.
How tf does she juggle and manage to do all this? I can barely do one of the above properly.
Although most likely she’s well compensated, and doesn’t have to waste time on useless efforts at work, this level of discipline and striving towards a goal is just very rare in general.
Possibly also no family, limited social life and no other hobbies.
However, discipline is an enormous factor too, actually using that extra available time on something “productive” is no easy feat.
Now I have kids and live in the same area as my parents and siblings again, entirely happy, but less free time.
Every person is different of course, there might be this one brilliant engineer forced to manage against his will somewhere.
Half of me kinda wants another lockdown so I can do more discipline-y stuff but the other half is like, dude you're just gonna waste it playing more games. I just gotta face the music - I'm just not disciplined and I just don't have the drive.
[1] https://rosenzweig.io/resume-en.pdf
Their work has inspired me to continue bashing away at my Zig PinePhone code, although I'll never have the skills to get its GPU running anything beyond a poke'd framebuffer.
That checklist of supported APIs in Asahi is mind-blowing, especially in such a short timeframe. Again, well done, thank you, and best of luck at Intel.
Thanks for all your amazing contributions Alyssa and all the best for the road ahead!
Well done.
Looking at the drama and people stepping down, I don't think MacBooks will be properly supported on Linux in this decade.
(The M3/M4 are in progress but not supported. That's public on the project's compatibility chart.)
Also the infrequent random OS crashes were annoying. And sometimes WiFi would stop working after sleep (would not show any access points) and would require a reboot.
M1 is 5 years old already and is still not fully stable and lacks features. It seems like the overall development effort started slowing down a couple years ago and while we did get the amazing audio daemon and graphics driver, development of other things seem to be stuck.
If I remember correctly, there were also some comments from Marcan (?) on social media about issues with supporting newer chips (M3/M4), hinting that M3 and M4 are vastly different and require significant effort to add Linux support.
So if M3, M4 and other future versions are too different to be supported in a decent time frame, then Asahi is all about supporting years-old hardware. That reduces interest from Linux users looking to buy a laptop now, potentially shrinking available donations, the developer pool, interest, etc.
I love what Marcan, Alyssa, James and others have achieved and how they have pushed Linux further. I think that their contributions will stay relevant and be useful for other hardware for many years to come.
For anyone who approaches it with the attitude that whatever Linux distros do is UNIX, there are enough surprises in there.
> Our priority is kernel upstreaming. Our downstream Linux tree contains over 1000 patches required for Apple Silicon that are not yet in upstream Linux. The upstream kernel moves fast, requiring us to constantly rebase our changes on top of upstream while battling merge conflicts and regressions. Janne, Neal, and marcan have rebased our tree for years, but it is laborious with so many patches. Before adding more, we need to reduce our patch stack to remain sustainable long-term.
https://asahilinux.org/2025/02/passing-the-torch/
> With Linux 6.16, we also hit a pretty cool milestone. In our first progress report, we mentioned that we were carrying over 1200 patches downstream. After doing a little housekeeping on our branch and upstreaming what we have so far, that number is now below 1000 for the first time in many years, meaning we have managed to upstream a little over 20% of our entire patch set in just under five months. If we discount the DCP and GPU/Rust patches from both figures, that proportion jumps to just under half!
While we still have quite a way to go, this progress has already made rebases significantly less hassle and given us some room to breathe.
https://asahilinux.org/2025/08/progress-report-6-16/
So if the discussions are true, it can take years for the developers to finish M1/M2 upstreaming with all the Linux kernel bureaucracy. That is, unless they decide to start working on M3 before finishing the upstreaming
Qualcomm has been beating the marketing drum on this instead of delivering. Ampere has delivered excellent hardware but does not seem interested in the desktop segment. The "greatest Linux laptop around" can not be some unmaintained relic from a hostile hardware company.
If you want to do a device, and your only chip option is Qualcomm I'd recommend not doing a device at all.
Can you see any other machine coming close to a Mac in terms of hardware quality and performance? Obviously the cost is silly, but while I agree with your sentiment, it seems optimistic to hope.
IME the Asahi support page is spot-on: There are a couple of yet-unsupported features (DP-alt mode being a big one), but any feature listed as supported will just work without hidden gotchas. I find this a big contrast to other devices, which will often "work" but have annoying little quirks here and there that are workable but can feel like a downgrade compared to Windows.
There's some room for improvement, but that is purely relative to macOS. Asahi still solidly beats other x86 devices (other than the low end ones you wouldn't do development work on).
One issue is that idle battery consumption is higher than on macOS (though it's an active area of improvement [1]); you'll notice this as an M1 laptop discharging about 12% overnight where macOS would've eaten maybe 2-3%. Not a big issue normally, but it can be inconvenient if the device shuts down overnight due to an empty battery.
During more passive daytime use (e.g. playing music), the display tends to be the biggest power hog. That's not really Linux-specific, but it's why I actively turn off the screen when not needed (KDE lets you configure the power button to do so).
[1] https://social.treehouse.systems/@chaos_princess/11498433865...
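The overnight numbers mentioned above work out to roughly the following idle drain rates. The 8-hour night is my assumption; the 12% and 2-3% figures come from the comment:

```python
# Back-of-envelope idle drain from the figures in the comment above.
# Assumes an 8-hour night (my assumption, not stated in the comment).
hours = 8
asahi_drain = 12 / hours      # %/hour under Asahi (12% overnight)
macos_drain = 2.5 / hours     # %/hour under macOS (midpoint of 2-3%)

# Hours of idle standby from a full charge at each rate:
asahi_standby = 100 / asahi_drain
macos_standby = 100 / macos_drain
print(f"Asahi: {asahi_drain:.2f}%/h, ~{asahi_standby:.0f} h standby")
print(f"macOS: {macos_drain:.2f}%/h, ~{macos_standby:.0f} h standby")
```

So even the "worse" case is a couple of days of idle standby — which matches the comment's point that it only bites if the laptop sits unplugged for a day or two.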
the great thing is, you can!
The MacBook Pro display is one of the best laptop displays.
(Better for the battery too, if you can keep most of the screen dark.)
Any sources for that? I'd be quite surprised if Apple had radically altered the architecture.
[1] https://developer.apple.com/videos/play/tech-talks/111375/
Man I wish I had half the energy of this author.
Trolling will get you banned here, so please don't.
https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que...
We can't not make mistakes. The best we can do is acknowledge when we make a mistake and do what we can to fix it.
> I censor whatever I like (dislike), I guess
If you knew how many comments I dislike on HN, you would no longer have that perception.
Beyond that, each M series generation also brings more of the system into the SoC. For example, the entire storage controller is part of the SoC in the M1, but the M2 brought in the trackpad controller as well.
Bringing more functionality into the SoC has many advantages but it does make it more difficult to target because you can’t just make use of existing off the shelf controller knowledge to apply to it.
And other great projects, like Corellium (an actual iOS VM, not that crap Apple makes), are hit hard with lawsuits etc.
(You know, a great project for the people who are still REing iOS for 0days and reporting them to Apple. That's long behind me now — reporting 0days for peanuts, yeah right :) )
With all of Apple’s secure boot stuff they had more than enough ways to totally squash running alternate OSes on the machines like a bug.
Instead they seem to have gone out of their way in a few places to make it not only possible but secure.
They’ll NEVER say anything publicly, or give documentation, but they’re leaving doors open on purpose.
If I remember correctly, Apple at the introduction of the M1 made some explicit statements about the hardware not being locked down. Something along the lines of nothing preventing Linux from running on it.
Corellium was selling and distributing access to Apple’s software along with security bypasses.
Asahi is not redistributing any Apple IP, is using Apple-sanctioned methods to run, and is not commercial.
She did the challenging stuff she cares about. One aspect of nerd brain often is that you can hyperfocus on challenging stuff, but can't get the motivation to work on stuff you don't care about - and even what would be a 20 minute task can end up taking days because of that. It's great that she has the self awareness to set goals, and step away once they're done.
I didn't have that at that age - and still sometimes struggle. I was lucky enough that my employer back then recognized my issues and paired other people with me for the stuff I wasn't interested in; nowadays I usually manage to offload those tasks onto co-workers myself.
(Mind you, I'm not talking about a matter of inborn temperament or character, much less a moral flaw! Rather, finding the compelling challenge even in "boring" tasks is a valuable skill and situational tactic that anyone should explicitly learn about and aim to acquire as part of becoming a mature professional, not a matter of morality or somehow being dismissed as "lazy"!)
I wouldn't be so quick to judge someone for ADHD.
Because I have it, untreated. And I couldn't even finish university because of it. I'm unable to do certain things, like at all, I'm nearly physically ill when doing these things. Hard to explain it, to someone without these problems :)
Luckily enough, it's not that important here / Idc about money, career etc.
I've said it before and I will keep saying it again: the financialization of everything and the utter dominance of braindead, long-since disproven MBA ideology is going to seriously impede our societies in the next decades.
At least with Panfrost it made more sense because it's still being used.
M1 chip laptops can only be bought second hand at this point
[1] https://9to5mac.com/2024/03/16/walmart-m1-macbook-air-launch...
[2] https://www.walmart.com/ip/Apple-MacBook-Air-13-3-inch-Lapto...
But 8GB of RAM.. that's unfortunately completely unusable by most developers. (Panfrost drivers you can at least use on RPi-like devices)
Maybe in another 5 years it'll work on the M3/4 and I'll revisit this. Good to know the devices are still being built so long after release
The Walmart deal is a total mystery. It started, seemingly, as dumping new old stock without selling it on Apple.com, but they’ve even updated the machine I think so clearly it’s an ongoing concern.
Nothing like it I know of for Apple, ever. I’d love to know the story.
It has been a long time since people have needed cutting edge laptops, so an M1 bought today will still work for 90% of people for the next 5+ years. Even if Apple doesn’t earn a large profit margin on the sale of the laptop, they could earn a decent amount on monthly services revenue, plus increased odds of that person buying a watch/airpods/phone/etc.
An M1 is great. But RAM and storage won’t hold up as long.
I suspect they can sell them at that price and still make a killing, and all the equipment to make chassis/etc is already paid off.
I see most people around me watching media, using a web browser to shop, maps, look at photos/videos (small storage is great for Apple, then more people buy icloud), fill out pdfs, and maybe some email or light excel.
Presumably, those are the people likely to buy a laptop at Walmart.
For the RAM, 8GB is not enough, but in fairness, when the system can page out at 200GB/s, paging doesn't hurt nearly as badly. It's only when things have to thrash the page file that it becomes readily apparent on these machines (say, when an application needs to keep more than a few GB of stuff resident in memory all the time).
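To put the bandwidth point in perspective, here is a rough comparison of how long paging a working set back in takes at unified-memory speeds (e.g. compressed swap kept in RAM) versus a typical NVMe swap device. The bandwidth figures are ballpark assumptions, not measurements:

```python
# Rough time to page a working set back in at different bandwidths.
# Bandwidth figures are ballpark assumptions, not measurements.
working_set_gb = 4
mem_bw_gbps = 200   # unified-memory class bandwidth (compressed swap in RAM)
ssd_bw_gbps = 3     # typical NVMe sequential read

t_mem = working_set_gb / mem_bw_gbps   # seconds
t_ssd = working_set_gb / ssd_bw_gbps
print(f"~{t_mem * 1000:.0f} ms via compressed memory vs ~{t_ssd:.1f} s via SSD")
```

Tens of milliseconds versus over a second for the same working set — which is roughly why light paging is barely noticeable on these machines while sustained thrashing still is.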
But even if 8 GB of RAM holds you today, will it hold you five years from now?
Or are you going to have to get rid of the computer much sooner and buy another one by then?
Whereas simply doubling the RAM would likely extend the life a significant amount.
It’s not high spec for sure, but with M1 RAM counting double (they swap very efficiently up to a certain point) it’s still plenty for casual use.
New M1 Macbook Airs are still available at Walmart (maybe elsewhere). But even if not, who cares? People are still writing code for computers that haven't been sold since the 1980s.
As the other replies show, you can still buy this machine, but it sounds like it likely won't be for too much longer
The developers involved must be acutely aware of it. Maybe they have some sense that the work will easily update to current M chips. Or maybe they don't really care about that. It was just an interesting exercise and they move on
Maybe it's just due to a complete lack of attention, but I think M3/4 support is extremely minimal at this point. Which is not a great sign..
If it was up to me, 2 years of successful reverse engineering (of a variety of projects/products) would be a requirement to be called an engineer. You learn a lot from working on things that you can't learn from a book (and without having to make the mistakes yourself first…)
Just to make it clear: I am not implying anything about Alyssa - just stating an observation based on my own experience.
To be fair, even if you have the best CPU and GPU designers, it's not as if you can call up TSMC and have them do a run of your shiny new processor on their latest (or even older) process. You can't fab them at home either.
Creating things is a gamble, as mass adoption is almost never by technical merits, but by marketing. So you could make open documented everything but still end up with nobody benefiting from that openness, because a competitor (whether open or not) wipes you out. You saw this happen even in the era where electronic devices were expected to come with full schematics -- there were winners and losers even then.
But, if something has become widespread and well adopted, and it's not open, that's a problem. It absolutely should be opened up and documented. Especially if it's not because the money-grubbing creators of the something are deliberately hiding how it works and locking down control in order to extract more money from everyone else's pockets. The sooner you put an end to that, and the more often you fight against that, the sooner society itself becomes more efficient and fairer for everyone.
For example, neither suspend nor hibernate works on Apple hardware, which means if you put the laptop away for a day or two the battery will die and you'll lose everything you were doing.
https://web.archive.org/web/20250520182445/https://rosenzwei...
Edit: I'm proposing a selection effect here, where people with these challenges gravitate towards spaces and communities where they don't feel confronted by them as often.
I hope it's clear that I am not trying to speak authoritatively and that this is more or less a total guess on my part based on some pretty superficial analysis.
There's also a high overlap between trans and non-neurotypical people. I'd suggest there's the same crossover between non-neurotypical people and tech circles, so it makes sense there's a higher incidence of trans people in tech.
I'd also suggest that on some level coming out as trans is basically hacker mentality. "This hardware/wetware doesn't look/perform correctly. Let me get a soldering iron/rewrite some code/clothing/hormones/surgery and change that."
Could it be that they also happen to be autistic? There appears to be a pattern I've noticed where a large number of trans developers are also autistic. This may be responsible for a lot of the passion and dedication required to work on projects like this. (Mostly video game) Emulation is another field that seems to have a similar overlap. Some might say Rust is another one.
Many people in her situation are not lucky enough to be in a mental or physical position to be able to pursue this sort of work or take advantage of their talent, and that is in large part due to the persistent and long-standing discrimination trans people face.
Even if one is totally self-interested, it pays not to discriminate. Even if one can't muster up even a little decency to not discriminate because, ya know, it's wrong.
If the original commenter said something like "it's inspiring that a trans person could overcome all odds and have such an accomplishment" then that comment has value. Randomly pointing out that the author is trans with zero context has no positive value.
“Asahi Linux wouldn’t exist without trans people. No one realizes how much Linux progress, and especially how much Linux graphics/gpu progress, hinges on trans people. Perhaps the selfish motivation of wanting Linux to succeed will make HN readers less placidly tolerant of trans hate in the future.”
But, as one of those “wait, you’re trans?!” people who silently contributes to projects without labeling myself, I get how it could be confusing. Hope that helps!
I always hope for the latter, but context and mood are impossible to communicate easily online which is why I was asking for clarification.
I know it can be at times grating to constantly hear about stuff like this, but I can assure you folks like Alyssa would rather focus on techie stuff rather than have to ask for help so they can stay safe. For them that worry is now always present and it’s not something they can ignore.
I hope it was the former, but that's why I didn't want to just assume the OPs intention.
The board is 7 people. I'm unclear what makes this awkward or inconvenient.
However, you could look at it the other way - if you take any group of 5-10 people I bet they're disproportionately something. Very few groups are selected evenly from all of humanity.
Regardless of your political or religious views on people's right to exist, simple politeness if nothing else should prevent this sort of comment. I assume you wouldn't say it to their face, so why say it here?
I agree with that statement, but consistently calling it out kinda seems to be having the opposite effect.
The banker thing would be bad optics because of conspiracies about it, but I'm not aware of an equivalent.
Except, you know, these specific people are openly trans and explicitly choosing for themselves to make a point about it.