- cross-posted to:
- hackernews@lemmy.bestiver.se
512 GB of unified memory is insane. The price will be outrageous, but for AI enthusiasts it will probably be worth it.
Weird that my mind just read that as MKUltra.
Maybe appropriate for AI.
Isn’t unified memory terrible for AI tho? I kind of doubt it even has the bandwidth of 5-year-old VRAM.
While GDDR7 VRAM obviously has more bandwidth, the sheer amount of memory can be a massive advantage for some models.
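For a rough sense of why both numbers matter: single-stream LLM token generation is mostly memory-bound, since producing each token streams the model weights once, so tokens/sec is roughly bandwidth divided by model size. A minimal sketch, where the bandwidth and model-size figures are illustrative assumptions rather than benchmarks of any specific machine:

```python
# Rough memory-bound estimate: tokens/sec ~= memory bandwidth / bytes read per token.
# All figures below are illustrative assumptions, not measured numbers.

def tokens_per_sec(bandwidth_gb_s: float, model_size_gb: float) -> float:
    """Upper bound on token rate if generating each token streams the full weights once."""
    return bandwidth_gb_s / model_size_gb

# A 70B-parameter model at 4-bit quantization is roughly 35 GB of weights.
MODEL_GB = 35

for name, bw in [("Ultra-class unified memory, ~800 GB/s", 800),
                 ("~5-year-old high-end GPU VRAM, ~900 GB/s", 900)]:
    print(f"{name}: ~{tokens_per_sec(bw, MODEL_GB):.0f} tokens/s")
```

On those rough numbers the bandwidth gap to older GPU VRAM isn’t huge, and 512 GB of capacity lets you load models that simply don’t fit in any single consumer GPU, which is presumably the appeal.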
taking Apple prices to a new extreme
can be configured up to 512GB, or over half a terabyte.
Are you ok mate?
They’re not wrong. 1000 GB is a terabyte, so 512 GB is over half a terabyte.
It’s exactly half a tebibyte though.
512 GiB is half a tebibyte. 512 GB is just under 477 GiB.
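A quick sanity check of those conversions (plain arithmetic, nothing machine-specific assumed):

```python
# Decimal (SI) vs binary (IEC) units, spelled out in bytes.
GB, TB = 10**9, 10**12
GiB, TiB = 2**30, 2**40

print(512 * GB / TB)    # 0.512   -> just over half a terabyte
print(512 * GiB / TiB)  # 0.5     -> exactly half a tebibyte
print(512 * GB / GiB)   # ~476.84 -> 512 GB is just under 477 GiB
```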
Yup.
- 512 GB > 1 TB/2 - what the article claims
- 512 GiB = 1 TiB/2 - what many assume
- don’t mix GiB and GB
Correct. But that means 512 GB is not half a tebibyte.
Ah, correct. RAM sizes are given in GiB, so I guess I implicitly made the switch.
That’s a retcon; hardware producers used the confusion between units to advertise less as more.
It’s nice to have consistent unit naming, but when the industry has existed long enough with the old units, it seems like intentional harm for profit.
How is it a retcon? The use of giga- as a prefix for 10^9 has been part of the metric system since 1960. I don’t think anyone in the fledgling computer industry was talking about giga- or mega- anything at that time. The use of mega- as a prefix for 10^6 goes back to 1873, over 60 years before Claude Shannon even came up with the concept of a digital computer.
If anything, the use of mega- and giga- to mean powers of 1024 is a retcon over previous usage.
That’s not a retcon. Manufacturers were super inconsistent with the units, so we standardized the terminology. For example, floppy disks were advertised as 1.44 MB, but have an actual capacity of 1440 KiB, which is 1.47 MB or 1.41 MiB.
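Those floppy numbers check out; the “1.44 MB” label is really 1440 × 1024 bytes, i.e. a mix of both prefix conventions. A quick check:

```python
KiB, MB, MiB = 1024, 10**6, 1024**2

floppy = 1440 * KiB                # 1,474,560 bytes, marketed as "1.44 MB"
print(floppy / MB)                 # ~1.47  (decimal megabytes)
print(floppy / MiB)                # ~1.41  (binary mebibytes)
print(floppy / (1000 * KiB))       # 1.44   ("marketing megabyte" of 1000 * 1024 bytes)
```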
The standardization goes back to 1999 when the IEC officially adopted and published that standard.
There was a federal lawsuit on the matter in California in 2020 that agreed with the IEC terminology.
All of this was taken from this Wikipedia article if you’d like to read more. Since we have common usage, standards going back almost 30 years, and a federal US lawsuit all confirming the terminology difference between binary and decimal units, it really doesn’t seem like a retcon.
OK, fine, all the world might say whatever it wants, but my units are powers of 2.
I prefer it too, but just because “gibibyte” is a stupid word doesn’t mean it’s fine to go against standards.
Agreed, but do you pick the de facto standard of the entire industry (minus storage advertising) or the de jure standard of an outside body that has made very slight headway into a very resistant industry?
The reality is that people will be confused no matter what you do, but at least fewer people will be confused if you ignore the mebibyte, because fewer people have even heard of it.
You pick neither, and enforce correct usage of both in advertised products. Tech people will adapt, and non-tech people will be confused regardless (they still confuse megabytes/sec and megabits/sec, and that’s an 8x difference).
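For the parenthetical: a byte is 8 bits, so an advertised 100-megabit/sec link moves only 12.5 megabytes/sec. A trivial check (the 100 Mb/s figure is just an example):

```python
advertised_mbps = 100        # example link speed in megabits per second
print(advertised_mbps / 8)   # 12.5 megabytes per second -- the 8x people trip over
```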
Well, this news means there will be cheaper second-hand M1 and M2 machines on the market.
Unfortunately that market is already flooded with functionally useless 8 GB machines.
My college buddy and startup cofounder had a pathetically slow old laptop. He asked me the other day, “should I buy an iPad Pro?” I was dumbfounded. Bro, you don’t even have a proper computer. We went around a bunch and he kept trying to get really bad ones, like a base model Mac mini. Finally I persuaded him to get a 16" M1 Pro for a grand (about 700 after his trade-in) and he couldn’t be happier.
I’m still using my M1 MBP like 4 years later. Don’t even care to upgrade! These things are great value.
M2 user here. It is wonderful. You cannot get it to even heat up.
Honestly, the base-level M1 mini is still one hell of a computer. I’m typing this on one right now, complete with only 8 GB of RAM, and it hasn’t yet felt in any way underpowered.
Encoded some FLAC files to M4A with XLD this morning: 16 files totalling 450 MB; it took 10 seconds to complete. With my workflows I can’t imagine needing much more power than that.
I thought a few days ago that my “new” laptop (M2 Pro MBP) is now almost 2 years old. The damn thing still feels new.
I really dislike Apple but the Apple Silicon processors are so worth it to me. The performance-battery life combination is ridiculously good.
Yes, that’s how computers work. Like all other depreciating assets.
That extreme’s name? Albert Einstein.
Ultra brings memories, but Sun was better than Apple.
Though having said that, Sun was kinda greedy too, and their hardware was about as proprietary as Apple’s.
Aesthetically Sun was amazing, though, while Apple is tasteless, aimed at plebes (technically Julius Caesar was from a plebeian noble family, but nobody thinks in that much nuance, or knows that plebeian noble families were a thing).
Is that much memory connected externally, or does the SoC just end up being a large package with that much RAM on it?
It’s just external and soldered to the motherboard on Macs, no?