On raw performance, the M4 really does deliver on Apple’s promises. Single-core is up about 20% compared to all M3 chips and more than 40% compared to the M2. Coming from the previous M2 iPad Pro, that’s at least a 42% generational jump in both single-core and multi-core.
All this computational power… on iPadOS 😵💫
Right, and they don’t really have many AAA titles either. The main thing holding this back is firmly the OS. I just truly don’t get it
Market segmentation is worth it for them, and the chips will be used in plenty of other hardware anyway, so dumping them in iPads doesn’t hurt, even if it’s mostly just marketing. It doesn’t necessitate any product change either.
It’s a waste of computing power, though.
I have an M1 MacBook Air and barely ever actually max out the CPU. Putting these chips in iPads, which are mostly used for drawing at most, is just a waste, and one of the reasons they’re so incredibly expensive. Apple could have just kept producing M1s and put those in current iPads.
The reality is, there’s zero innovation in Apple products. The switch to M1 was really great, but everything since then was just “more M is more better”, utility stayed the same, price went up. Awesome.
It’s not a waste at all. The extra computing power lets them get much better performance than the previous model, or the same performance at half the power draw. That’s pretty important in a mobile device.
It isn’t a waste if people buy it. Putting M4s in the iPad lets them market it to rubes who think a bigger number is better without reading the spec sheet or understanding their own requirements, and if they’re already manufacturing M4s to put in other things, that’s one less production line that needs to run.

Sure, they could release an iPad Cheapass Edition with an M1 in it and sell it comfortably at a profit for like $80, but the market for those is likely to be small, they won’t make nearly the margin that the M4 iPad will, it requires an entire extra production line, and most importantly it isn’t flashy enough for Apple. They don’t want to release a product that feels cheap, even if it was specifically intended to be cheap. It’s bad brand optics, and they care about that a lot. Let China sell a bunch of bootleg tablets to the people that want them; they’re gonna do that anyway whether or not Apple gets on the train, and this way Apple isn’t tarnishing its product lineup with a PoorPad™
I agree completely with your comment.
just truly don’t get it
It’s the same reason Macs don’t have touchscreens. If they can both do the same tasks, why would you buy both? And LOTS of people buy both.
Perhaps with a more demanding OS, such as Linux or macOS, the battery and thermals just wouldn’t suffice?
I mean, an iPad is basically a larger phone, and phones can get pretty hot when pushed to their limits.
Also, I don’t think the RAM would be enough for intensive tasks. The device as it is could be pretty good for gaming, though, if only the game library weren’t mostly shit.
But at the same time, a MacBook Air doesn’t seem much bigger compared to the biggest iPad available.
Isn’t iOS just a heavily modified Unix clone? My jailbroken old iPad has /var/log and misc GNU directories, as well as the APT package manager to access Cydia repos.
Not a clone: macOS is actually certified UNIX, and iOS shares the same foundation. It’s just a heavily modified UNIX.
It is, but it would be like saying Android is just another Linux variant.
What I wanted to stress in my initial comment is that the OS is so heavily modified and focused on optimization and RAM management that it hardly works for power users once multitasking is on the table.
Maybe they’ll finally announce something interesting at WWDC. I’m ready to be hurt again
I get it if you’re doing photo editing on an iPad. That stuff is still a CPU hog.
That said, the M3 is on an end-of-life manufacturing process, and now that these things get updated every two years, it just makes sense to put the M3’s successor in this thing. A Pro with an M2 would stick out like a sore thumb in two years, and the M3s are going to start disappearing from the lineup soon.
That’s why they also announced multicam synced video editing in the iPad version of Final Cut Pro. Video editing with many source streams involves a ton of compute, so in theory it can actually use the CPU. Other than that, though, it’s hard to marry that overpowered hardware with underpowered software.
Now you can load up Facebook 0.0001% faster
RUNNING IPAD OS?? Apple what is happening 😭
For Lemming harder.
If Lenovo were really clever, they’d now spend some money on creating a Linux desktop that’s as polished and usable as macOS, and use truly Retina-level displays. I’ve never been more ready to ditch Apple.
In general, I would love for any OEM to step in and provide similar build quality to a Mac… doesn’t even have to be Lenovo (who IMO are a pale imitation of IBM’s line of laptops).
The Lenovo additions to the ThinkPad lines (like the foldable ones or tablet hybrids) are pretty horrible; the classic ones (T, P) are still good.
The X1 Carbon ultrabooks, or whatever they’re called, are also OK for the weight.
I bought a used P51 and love developing on it, because using Docker on an OS where it’s natively integrated is a game changer. But at the same time, looking at the ugly font rendering on a dim 4K screen with huge one-inch bezels spoils it again. Developing on a Mac feels less like work because of their attention to design.
I have an X380; it’s pretty decent for what it is. Sure, there are plenty of things I’d change (RAM slots instead of soldered memory, a 3:2 or 4:3 aspect ratio for the touchscreen, maybe a bit less weight), but later gens actually have a few of those improvements. It’s not really a great replacement for an iPad, but it’s a pretty decent work machine (provided you don’t need a ton of power or RAM).
I can only think of one thing you could do with this much power … run an LLM
I’m so annoyed they announced this.
I have a slew of Raspberry Pis kicking around doing various things. I also have a name-brand NAS that reportedly lets you run other software, including containerized apps, but their implementation is whack and doesn’t work super well.
I want to get a more powerful machine to use as a replacement server. I’d like to spin up my own LLM tools, use it with software like PhotoPrism to auto-tag my pictures, or even spin up Frigate on it. My leading contender had been either a Jetson Orin Nano or a system with the Core Ultra 155H chip, but now I might have to wait until they announce/release M4 Mac minis, which is really annoying because I want instant gratification for my half-baked ideas.
Now you have time to actually write up a design document and let your half-baked idea become a fully cooked one before you drop a bunch of cash on it
What are you, some kind of financial advisor!?
… because if you are, are you taking new clients? My shit is whack.

For legal reasons I am required to inform you that I am not a financial advisor. In fact, I am not real.
Are you an LLM running on an M4?
We’ll find out the future of iPadOS in one month! They’ve raised the price on the Pro models; hopefully they have a big-ass update ready, or all the reviewers are gonna say the same thing: “great hardware let down by shit software”
Reviewers have been reaching that same conclusion since the very first iPad; it hasn’t deterred Apple yet.
I don’t use Apple’s stuff, but alternatives to x86 could be the future. The one thing they need is compatibility with x86 software; otherwise mass adoption is heavily crippled. It doesn’t matter as much for Apple, since their whole ecosystem is under strict control, but for general-purpose consumer hardware that compatibility has to come first.
Apple already stopped selling x86 devices, and even the stuff that isn’t under their control seems to work fine.
The performance isn’t inherent to ARM; x86 can definitely catch up to this.
In theory yes if they can handle thermals, but I don’t see that happening.
I have a friend who said that on his M2 MacBook, even before the Apple Silicon build of Factorio was released, the game ran better in x86 emulation than on his previous machine. And ran much cooler.
The battery life and thermals that come out of these powerful ARM chips are amazing, and anything that can be multithreaded is going to perform brilliantly on these chips.
Obviously for stuff where thermals and power consumption aren’t as important the gains aren’t as large, but I can’t remember the last time I worked on an actual desktop machine rather than a laptop with or without a docking station.
That heavily depends on what the previous machine was. Factorio runs on my laptop without taxing the system much more than idling, and on my desktop I can’t even tell it’s running based on performance monitoring. So yeah, I’m not sure Factorio is a good indicator.
Sure, definitely not a perfect benchmark. I’m not saying it’s going to outperform a current x86 machine in general. But if it can perform as well as or better than a relatively powerful x86 machine from a few years prior, while emulating, that’s impressive.
But I don’t know, I don’t have a MacBook.
I’m pretty sure the old AMD APUs from the Bulldozer era can run Factorio, and those are like a decade old.
Like sure, it’s some metric, but I’m pretty sure any computer produced today can run Factorio.
I’ve got a high-end Intel MacBook Pro and a low-end M1 Mac mini. The Mac mini runs x86 apps like Civ 6 faster and smoother than the Intel MacBook can.
I don’t doubt it; Apple has never had good gaming performance. But a non-Apple x86 laptop in the same price range aimed at gaming can run it a lot better.
If the leaked score is true, isn’t it beating every CPU in single-core performance?
In Geekbench, yes. From other reporting I’ve seen, the major improvement here comes from the Scalable Matrix Extensions (SME) on the M4, which Geekbench supports. Real-world gains would be limited to certain scenarios and would require application support for SME.
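If you’re curious whether your own machine reports SME, macOS exposes CPU feature bits as `hw.optional.arm.FEAT_*` sysctls. Here’s a minimal sketch; the exact `FEAT_SME` flag name is an assumption on my part, so run `sysctl hw.optional.arm` to see what your chip actually advertises:

```python
import subprocess

def cpu_has_feature(flag: str) -> bool:
    """Return True if the given macOS sysctl CPU-feature flag reads as 1.

    Flag names like 'hw.optional.arm.FEAT_SME' follow Apple's documented
    hw.optional.* convention; the specific SME name is an assumption.
    """
    try:
        out = subprocess.run(
            ["sysctl", "-n", flag],
            capture_output=True, text=True, check=True,
        )
        return out.stdout.strip() == "1"
    except (subprocess.CalledProcessError, FileNotFoundError):
        # Unknown flag, non-macOS system, or no sysctl binary at all
        return False

if __name__ == "__main__":
    print("SME:", cpu_has_feature("hw.optional.arm.FEAT_SME"))
```

On anything that isn’t an M4 Mac (or isn’t a Mac at all) this just prints False, which is the point: an app has to check and ship an SME code path before the benchmark gains show up in real workloads.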
I see a lot of “Apple says” here. I’ll believe it when I see it, and not from their shitty graphs with no numbers that compare against four-year-old processors.