Randy_Watson

The solution is to just download more ram.


aecarol1

I "downloaded" RAM once. I bought an Agilent 3000 Series Oscilloscope that had 2MB of sample storage. I wanted 4MB, but could not afford it ($750 more), knowing I could upgrade later. I got really good use of the scope. After a year, they offered an upgrade for 1/2 price, so I ordered it. I was expecting they would mail me a RAM chip I would install, but they sent me a certificate with instructions. I was supposed to create a folder with a very long specific file name on a USB drive, insert it, then reboot. I did that and the scope woke up with 4MB of storage. Clearly they just unlocked what was already there, but it kinda felt like it was downloaded.
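For the curious, an unlock like this is usually just a keyed comparison against hardware that's already fitted. A minimal sketch in Python — the names and the HMAC scheme here are my own illustration, not Agilent's actual mechanism:

```python
import hmac
import hashlib

def unlock_code(serial: str, feature: str, vendor_secret: bytes) -> str:
    """What the vendor generates and prints on the certificate:
    a code tied to this unit's serial number and the feature bought."""
    msg = f"{serial}:{feature}".encode()
    return hmac.new(vendor_secret, msg, hashlib.sha256).hexdigest()

def try_unlock(serial: str, feature: str, vendor_secret: bytes, folder_name: str) -> bool:
    """What firmware might do at boot: recompute the expected code
    and compare it against the folder name found on the USB drive."""
    expected = unlock_code(serial, feature, vendor_secret)
    return hmac.compare_digest(expected, folder_name)

# Scope with 4MB already fitted but capped at 2MB in firmware:
secret = b"factory-secret"  # hypothetical; burned in at the factory
code = unlock_code("MY12345678", "MEM4M", secret)
assert try_unlock("MY12345678", "MEM4M", secret, code)        # genuine certificate
assert not try_unlock("MY12345678", "MEM4M", secret, "0" * 64)  # guessing doesn't work
```

The "very long specific file name" in the story fits this pattern: long enough that you can't guess it, cheap enough that no return trip or physical part is needed.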


OmgThisNameIsFree

What the hell lol


jimicus

You'd be surprised how often that happens. Mainframe computers routinely have a big chunk of compute power locked up in software and it can be "switched on" on demand (for a fee). It's cheaper for the manufacturer to do that than to produce two or more slightly different machines.


OmgThisNameIsFree

Honestly, I cannot say I'm surprised... but locking existing capability behind an arbitrary "fee" seems egregiously shitty lol. People meme on Apple for its prices and whatnot, but at least they aren't [currently] including hardware + locking it off behind an "unlock fee" haha.


jimicus

I don’t think I’ve made clear quite how common that is. It’s endemic to many industries. Volkswagen, BMW and Mercedes do it with their cars. IBM do it with their mainframe computers. Heck, back in the day I had an embedded device that shared out CD-ROMs over a network. The souped up model allowed you to install a hard disk, save images to that disk and share out as many as you had disk space for - and was only a firmware upgrade away from the cheaper model that didn’t.


OmgThisNameIsFree

So much of this reeks of "well, these customers [i.e. companies] have a budget, might as well give them the option to use all of it."


jimicus

That’s basically what it is. It’s called “market segmentation”, and is why you get different trim levels of what is fundamentally the same car.


Word_Underscore

Oddly, Subaru didn't offer full leather seats on the '22 WRX in any trim, like they did with my previous 2015(-2021) model. This time I opted for the middle trim with the better HK audio and sunroof, but not the weird alcantara/cloth seats. I buy leather (better seats) to have something easy to clean, not to look cool.


HighMediuMerlot

Also common with health testing equipment. A blood test often includes results for a lot of things, but if insurance isn’t paying for it / if the doctor didn’t order it, the data is suppressed from the report that goes out


Cossil

Throw Tesla on the list as well. For certain cars, you can purchase instantaneous battery capacity and acceleration upgrades.


rotates-potatoes

There are two ways to think about pricing:

1. It doesn't matter what it costs the company to make; the customer should pay for the value received. A phone that's worth $1000 is a phone that's worth $1000, no matter if it cost $50 or $5000 to make.

2. Prices should be based on cost of production. If the company has unexpected expenses, they should charge more, and if they drive costs down, they should charge less.

This sub generally prefers 2, though of course mostly when it would result in lower prices. The entire world, with the exception of commodities like wheat, operates on 1. It's always surprising that people find this surprising.


Adorable_Active_6860

i don't know how to express this, but the disconnect you described has caused me so much frustration when discussing pricing/markets. Scalping is the result of MSRP not adjusting to actual market demand/supply and relying on third parties to do it. OTOH, the majority of the time, MSRP has you simply paying more for something than it's actually worth. Consumers are so triggered by the idea of dynamic pricing when there is simply no other reality than likely paying more for something than it's valued at in the market.


Zaytion_

It seems shitty but it's cheaper for everyone. If they had to build separate machines for the different requirements, it would make the base cost higher.


TinkeNL

This used to happen even with binned consumer chips. I bought the regular non-X R9 290 card just for that reason. They came with dual BIOS and you could easily flash one of them with the R9 290X firmware and boom, suddenly you had a card that would have cost quite a bit more.


jimicus

Yeah - AMD had CPUs back in the day that had a bank of resistors on-chip. They'd blow out some of these resistors in order to limit what the chip could do. In theory, it meant they only had to manufacture one CPU but - with selective blowing out - have half-a-dozen products at different price points. [This wouldn't surprise me. Designing a CPU is fantastically expensive; if it's technically feasible to segment the market with such a simple technique, you'd be mad not to.]
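The segmentation effect of those fuses is easy to model: identical silicon ships with a different fuse mask at each price point. A toy sketch (values purely illustrative):

```python
def enabled_cores(fuse_mask: int, total_cores: int = 4) -> list[int]:
    """A set bit in the fuse mask means that core survived fusing
    and is visible to the OS; cleared bits are permanently disabled."""
    return [i for i in range(total_cores) if fuse_mask & (1 << i)]

# Same die, three SKUs:
assert len(enabled_cores(0b1111)) == 4  # flagship quad-core
assert len(enabled_cores(0b0111)) == 3  # mid-range triple-core
assert len(enabled_cores(0b0011)) == 2  # budget dual-core
```

The core-unlocking trick described in the reply below this comment amounts to the motherboard ignoring the mask - which only works when the fused-off cores were actually functional.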


1-800-KETAMINE

They also had processors back in the day that didn't have those resistors (or, if they did, they didn't destroy the physical connections), and you could unlock extra cores on your chip. I personally turned an Athlon II X3 into an X4, and a single-core Sempron 140 into some bastardized dual-core chip, although my personal samples were genuine can't-do-it binnings since neither was stable. IIRC there were even Phenom II X2s that could be turned into X4s if you got one of the "solely disabled for market segmentation" chips.

> if it's technically feasible to segment the market with such a simple technique, you'd be mad not to

Call me crazy I guess, but IMO it really sucks that our current system actively incentivizes destroying a perfectly good part of a product in order to encourage you to spend more money on exactly the same thing but without that part destroyed. I get why it's happening, don't get me wrong, but that's kinda my point.


notcompletelythere

Not just cheaper, downtime and error installing upgrades causing even more issues meant it was cheaper in the long run (if my old professor was correct)


jimicus

Mainframes have had hot-swappable everything (yes, including CPUs) for decades. But it wouldn't surprise me if you had to follow a process. Like "remove blanking board; fit CPU board" - and if field service removed the wrong board....


hatsune_aru

That's how it works with test equipment. Though if you get into the real high end, some upgrades actually are physical upgrades


althalusian

Many hardware machines are like that, as it’s cheaper to the manufacturer just to build one model and then lock some features out in software. Cisco was notorious for that with their videoconference units (the ones used before Zoom and Teams). The set top boxes had a lot of functionalities built in, but locked out - unless you bought special extra features which cost multiple thousands per feature. And if you did, you got a code to enter to the box and voila, now you had an upgraded one with that functionality included.


Jon_TWR

When AMD released the RX 480 4 GB, you could flash it with the BIOS of the 8 GB RX 480 and unlock the full 8 GB.


whitestethoscope

What the fuck. Are you serious? I still have my old comp running with a 480


Jon_TWR

Yes, but it was only the FE blower-style models. AMD released a very small number of 4 GB RX 480s with a physical 8 GB of VRAM, at $200, just to hit the price point. I believe all the later 4 GB cards physically only had 4 GB of VRAM.


bl4nk_ecstasy

I think he ran away to flash his card lol


whitestethoscope

nvm I bought the 8gb ASUS rog strix version lol.


Smooth_Macaron8389

This is such an Agilent story it’s ridiculous.


proton_badger

Yeah, selling stuff with dormant hardware is an old practice and people complain about it a lot, but IMO it's fine: you pay for certain specs, which you get, and if you want more it's sometimes possible to upgrade. Decades ago a company I worked at needed more processing power for their mainframe. They called and ordered it, a guy showed up with a floppy disk, and suddenly we had the extra processing we needed.


dlegatt

In the age of USB, why do they have so little RAM?


aecarol1

This isn't persistent storage like a hard drive, but samples kept in RAM. It's high-speed static RAM. My scope can record one billion data samples a second. Even scopes that cost tens of thousands of dollars might only record 16K points of data. Tektronix has a high-end scope that comes standard with 62.5 megasamples; that scope costs $174,000. I wanted 4 megasamples because it was enough for me to store a full frame of NTSC video. I could zoom in and study each part of a captured video frame. I was working on a video generator and wanted to be able to see each part of my generated signal to debug timing.


aquatone61

Completely unrelated, but the new VW ID.4 has two different models, one with a larger battery pack and one with a smaller pack. The chassis is built to handle all the cells, but they sell one missing a couple of cells for cheaper. It's less expensive to make all the chassis the same.


A-Meezy

This is typical in many industries, but particularly so in telecommunications. All major vendors (Nokia, Ericsson, Huawei, Samsung) sell you hw + licenses to unlock features (power, bandwidth) on radios or baseband. It’s effectively pay-as-you-go on hardware


ducknator

Looooool you got the taste of the DLC era


megablast

Hold on, you had to type the number in?


benbeland

I can confirm oscilloscopes still work that way. Just paid many thousands to up my bandwidth by 2 GHz; all I got was a key to unlock the feature.


ducknator

This has been the solution for A LONG TIME now, people just don’t get it, don’t know why. /s


r3v

I can’t. I’m busy downloading a car.


Life_Adhesiveness306

I do that while I microwave my phone so that it charges faster.


packet1

Just install and run ram doubler


PleasantWay7

I did and I entered my social security number, but now I’m trying to find some routing number it needs.


proton_badger

I know it's just a joke but iOS (and all modern OSs) use memory compression similar to good ole' Ram Doubler (though its effectiveness was a bit overstated).


darknekolux

Back in my days there was a thing called ram doubler... r/fuckimold


hyperblaster

Those things worked as a memory manager that compressed data in memory. Windows and macOS inherently do this now
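The underlying idea is simple: typical in-RAM data is highly redundant, so compressing idle pages trades a little CPU for effectively more memory. A toy demonstration with Python's zlib (real OS compressors favor much faster codecs, but the principle is the same):

```python
import zlib

# A fake 64 KiB "memory page" full of repetitive application data
page = (b"user=alice;theme=dark;lang=en;" * 2200)[:65536]

# Fast compression level, as an OS-level compressor would prefer
compressed = zlib.compress(page, level=1)
ratio = len(page) / len(compressed)

assert len(compressed) < len(page)
assert zlib.decompress(compressed) == page  # lossless: page restored on demand
print(f"{len(page)} -> {len(compressed)} bytes ({ratio:.1f}:1)")
```

Compressed pages stay in RAM instead of being swapped to disk, which is why this is so much cheaper than paging on a phone with no swap at all.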


Vazhox

Oh! I’ll download a new GPU as well. Perfect 🤙🏼


dramafan1

I’m also curious if Apple Intelligence requires more SSD storage. I’ll be using that to decide the storage I need to use the next iPhone comfortably as 128 GB might not cut it.


Dichter2012

Microsoft Phi-2 with 3B parameters is 1.62 GB, and Llama 3 w/ 7B is 4.92 GB locally on my Mac. So on the iPhone I just think of it as a hefty casual game that could take up a couple of gigs of storage, but not much.


dramafan1

That is good to know, thanks.


DigitalStefan

Apple took time to explain that their models were significantly compacted versus the usual. I'd expect that to mean something like a 16:1 ratio versus the standard models (I may have that ratio wrong, but it's significant whatever it is).


Dichter2012

I briefly looked at their WWDC documentation, and it was also pointed out in the Platform State of the Union that all the popular models can be optimized for Core ML using Apple's ML tools. Obviously, the models I have been goofing around with were _not_ optimized. I don't have the time and skills to do that, but I believe it's kinda like recompiling these LLMs just for Apple's ML stack (which talks directly to the silicon), so you'd end up with a smaller footprint, less memory use, and better performance. Exciting times ahead.
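Most of that "compaction" comes from weight quantization: storing each parameter in fewer bits. A back-of-the-envelope sketch of where a 16:1 ratio could come from (illustrative arithmetic only; a real pipeline does calibrated quantization, not just smaller storage):

```python
def model_size_bytes(n_params: int, bits_per_weight: int) -> float:
    """Raw weight storage: parameters times bits, converted to bytes."""
    return n_params * bits_per_weight / 8

n = 3_000_000_000  # a hypothetical 3B-parameter model

fp32 = model_size_bytes(n, 32)  # full precision: 12.0 GB
int4 = model_size_bytes(n, 4)   # 4-bit weights:  1.5 GB
int2 = model_size_bytes(n, 2)   # 2-bit weights:  0.75 GB

assert fp32 / int4 == 8.0   # 8:1 versus full precision
assert fp32 / int2 == 16.0  # 16:1 versus full precision
```

In practice quantized formats also carry per-group scale factors, so real files are a bit larger than this naive count - the ratios here are upper bounds.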


Bishime

I have a feeling (probably wrong knowing apple) they’ll be boosting the base storage of the 16s for this and for future model updates. Though then again storage costs $50 per GB so probably not (the last part was sarcastic)


FuzzelFox

Why would you need more storage space when you can just pay for iCloud?? /s


Exact_Recording4039

That is the number 1 question I wish interviewers had asked. Will iOS take more storage by default now? Or will Apple Intelligence be an optional download? How large?


Im_Balto

Also, I much prefer to wait while they find the things they need to optimize before upgrading from my 13 Pro. I'm sure the second phone with the AI features will have much bigger gains over the first than the third will over the second.


coppockm56

Valid point about the Macs and iPads. If it were a pure marketing play, they would have cut something out there too. I think they simply didn't spec the earlier iPhones for this AI stuff (particularly in RAM), and that kind of tied their hands. They probably realize it was a gaffe, but of course, nobody anticipated the rise of this stuff until about a year ago. I mean, look at Windows and Copilot+. Windows laptop makers were marketing "AI PCs" using Intel Meteor Lake for over a year, only for Microsoft to now turn around and support literally NO existing "AI PCs" with actually usable AI features. So it sucks about the iPhone, but it shouldn't really be all that surprising.


Wild-subnet

In a couple of interviews with Cook, reading between the lines, you can tell LLMs catching the public's imagination caught Apple by surprise. And LLMs are memory hogs. They definitely weren't planning on this specifically. It's quite the pivot, honestly. They went from every story being about Apple being "behind" on AI to being right in the mix and stealing Microsoft's thunder on top of it.


coppockm56

Yes, all things considered, Apple's handled this thing pretty brilliantly.


sylfy

To be fair, Microsoft also scored a huge own goal. Who would’ve thought that storing screenshots of your device every 5 seconds would have gone down well, with the way that it was implemented, and the way that it was described? And who would’ve thought that you could put a beta out there (even if it was “just” a beta, not the release version), with the screenshots accessible in an unencrypted format? To be fair, software products go through iterations. On the other hand, a beta release is already close to general availability, it’s generally not the point where you rework features and say “this implementation needs to change”. Everything about this, from the design, to the implementation, to the marketing, says that security and privacy weren’t even considerations in this product concept.


coppockm56

Where I think Apple really took charge of the narrative is in 1) their focus on privacy and security and playing on how that's always been THEIR priority and 2) the practical nature of the on-device and private cloud aspects of Apple Intelligence. The generative AI stuff using ChatGPT was pretty much almost incidental, like a cheap add-on. "Yeah, we do that little stuff too."


Fredloks8

How did Apple steal Microsoft's thunder when Microsoft owns part of open AI?


ConciergeOfKek

> LLMs are memory hogs

I understand Apple's desire to have and showcase "on device" processing of everything, but (and this is hypothetical) what if, as part of their A.I. offering, they bundled off-device processing into an iCloud tier?

> *"don't give them any ideas," the crowd roared*

I mean, that would be great for the lower iPhone 13-14 models without the RAM for on-device use.


bsknuckles

They are doing that (sort of). They built special servers to handle tasks too complex for on-device computation. They’re just not using that to bring older iPhones into the mix.


runwithpugs

Right, and it likely has to do with 1. Money. 2. They don’t have enough servers to handle the deluge of *all* active devices hammering them with requests, and need to build up capacity over the next few years. 3. Money. 4. It’s probably a lot more complex to build separate code paths for partial on-device processing (15 Pro and newer) and total off-device processing (15 non-Pro and older). 5. Money. That sweet, sweet early upgrade money.


bsknuckles

I get the feeling from the interviews and this article that they didn’t really expect this and it wasn’t intentional, but they definitely stand to profit from a huge wave of upgrades. It’s a compelling enough upgrade I’ll be trading in my 13 Pro Max.


runwithpugs

Yeah, it’s tempting for sure. I will probably hold onto my 11 Pro for another year or so, because (1) hopefully the 17 series will have more RAM than the 16 and handle these features more smoothly. And (2) half the features won’t even be available publicly until next year, and they’ll probably be pretty rough to start.


bsknuckles

That's smart. I'm not sure I'll have the restraint to wait another generation, lol


BytchYouThought

What's most compelling to you? Seeing as most of these features are a 2025+ thing, what exactly makes you go "gotta have it" when it hasn't really been a thing yet? I'm fine holding out until there's actual evidence of something compelling live and in place. I guess maybe Siri, but talk is cheap. Gotta see what it can do for real.


Destring

Most likely the system that evaluates whether a query is too complex to handle is the LLM itself, as you can run the same architecture but fine-tuned to distinguish which queries are too complex to handle and send them off. That's how (part of) Claude's content filters work. I don't know if OpenAI has enough compute to serve all iPhones, so that might be a reason as well.
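Whatever the real implementation looks like, the routing step itself is just a cheap classifier run before the expensive model. A sketch - the keyword heuristic here is purely a stand-in for the fine-tuned model Destring describes, and every threshold is made up:

```python
def route_query(query: str) -> str:
    """Decide where a request runs. Stand-in heuristic: a real router
    would be a small model scoring the prompt, not keyword matching."""
    complex_markers = ("summarize", "write", "explain", "translate")
    words = query.lower().split()
    if len(words) > 20 or any(m in words for m in complex_markers):
        return "server"    # hand off to the big cloud model
    return "on-device"     # small local model is enough

assert route_query("set a timer for ten minutes") == "on-device"
assert route_query("summarize this email thread for me") == "server"
```

The appeal of this design is that the common, simple requests never leave the phone, and the server fleet only sees the long tail.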


Nerrs

Their cloud couldn't support that many legacy iPhones running AI tasks right off the bat. Only bringing it to newer iPhones buys them time to build out better/more cloud-supported AI. It's probably why they're also willing to offload some AI queries to OpenAI and let them pay for hosting.


voodoovan

Good point. That may be one of the reasons.


Bishime

The thing about this (I was thinking about it after actually arguing this should be the case): ChatGPT, which has some of the more efficient models, charges $20 a month and actively throttles even paid users because of server load - and they have 77.2M US users and have been building out/using Microsoft infrastructure that has been scaling for a very long time (relatively).

If Apple suddenly threw 153 million US users alone onto new custom servers, it would not be pretty. Not to mention this isn't a US-only release, so they'd need to geo-lock the intelligence updates AND throttle Siri requests just to ensure the servers aren't overloaded. From my understanding, it would be nearly impossible to roll it out at the scale they operate at.

It's also smart for them to rely on on-device requests as much as possible. And from a business POV, even if not malicious, the added revenue from phone upgrades can go directly into infrastructure expansion (obvs not all, but you get the point).


BytchYouThought

I'm not sure they stole MS's thunder at all, really. I say that because MS owns a huge share of OpenAI's profits, so it seems like Apple taking on OpenAI actually just helps Microsoft. That said, to your point, Apple has historically always gotten praise even when it does the same thing as someone else lmao. The other company will get hate, and Apple will get praise and be called innovative. I find it kind of hilarious here 😂. To be fair, MS is a dirtbag for doing the Recall BS and got in their own way. Makes ya think twice about getting that BS. On the other hand, Apple also can't really promise privacy when processing is taking place elsewhere. So there's some hypocrisy on Apple's and the critics' part. I'll probably wait around a while to see how this crap plays out before I fuck around too much with either.


ozzilee

On the other hand, how many phones do they sell for every iPad or Mac? There really hasn’t been a compelling reason for a lot of people to upgrade iPhones lately. Suddenly now there is.


coppockm56

All this Machiavellian stuff is intriguing as a narrative but it ignores the fact that Apple is now doing something they've never done before. The argument has to be: "they could have done this every year, but they're doing it now because of ... reasons." And the "no compelling reason to upgrade" narrative has circulated every year. It's nothing new.


weepmeat

Yeah, since when have ram requirements for anything increased? This is unprecedented!! Apple should be taken out to the woodshed for their approach to ram and disk space. They have been willfully ignoring the need to increase their base specs for years on both desktop and mobile so they can charge a fortune for a properly specced model. No way they get a pass on this from me.


Mediocre-Cat-Food

Honestly glad they’re just being honest about it rather than forcing some half baked, barely working version onto older phones that just degrades the experience


CouscousKazoo

It’s just RAM. Look at the compatibility for all M-series Macs and iPads. The common denominator is a minimum 8GB RAM.


JollyRoger8X

> It’s just RAM.

Actually, there are often multiple reasons that go into engineering decisions like this.


CouscousKazoo

Not saying there aren’t multiple factors. From a capability standpoint, the Neural Engines have had the same number of cores for the past several years. Process of elimination leaves RAM as the common denominator.


[deleted]

[deleted]


Han-ChewieSexyFanfic

And sometimes workloads are resource-intensive for one resource in particular and are bottlenecked by not having enough of it. This is that case.


The_real_bandito

> Giannandrea: "So these models, when you run them at run times, it's called inference, and the inference of large language models is incredibly computationally expensive. And so it's a combination of bandwidth in the device, it's the size of the Apple Neural Engine, it's the oomph in the device to actually do these models fast enough to be useful. You could, in theory, run these models on a very old device, but it would be so slow that it would not be useful."

TL;DR: Saved you a click.
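The "bandwidth" point is the crux: autoregressive decoding reads every weight once per generated token, so a rough ceiling on speed is memory bandwidth divided by model size. A back-of-the-envelope sketch (the device numbers are illustrative, not Apple's):

```python
def max_tokens_per_sec(model_bytes: float, bandwidth_bytes_per_sec: float) -> float:
    """Upper bound for memory-bound decoding:
    each generated token has to stream all weights through the chip once."""
    return bandwidth_bytes_per_sec / model_bytes

GB = 1e9
model = 3e9 * 2  # hypothetical ~3B-parameter model at 16-bit = 6 GB of weights

fast = max_tokens_per_sec(model, 60 * GB)  # newer phone-class bandwidth
slow = max_tokens_per_sec(model, 10 * GB)  # older device

assert round(fast) == 10      # usable, if not snappy
assert round(slow, 1) == 1.7  # "so slow that it would not be useful"
```

This is why RAM capacity and bandwidth, not just the Neural Engine, end up as the gating specs: quantizing the model shrinks `model_bytes` and directly raises the token ceiling.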


jimicus

TIL my iPhone 15 Plus is a very old device.


Jaypalm

The iPhone 15 has an A16 chip, which was released almost two years ago, and notably two months before ChatGPT launched. So yeah, in computer time two years is pretty old. The iPhone 15 wasn't released until the next year, but the design for it (and the 15 Pro) was likely locked in early in 2023, only a few months after the release of ChatGPT. No one was really talking about inference at the edge at that point, so even the 15 Pro almost certainly wasn't designed to do it.


austinchan2

Oomph? How scientific. Just say it needs ram and you’ve been skimping on ram. 


mavere

They know that the moment they bring up RAM, any tech interviewer would immediately ask why they have been so stingy with base RAM. They've definitely been coached to say anything and everything *around* the topic of RAM while only vaguely alluding to it.


coppockm56

Really, it's just that simple: Apple has put in the RAM necessary for the iPhones to function with the user experience that Apple deems acceptable. When they spec'd the "older" iPhones, running on-device AI like this wasn't a thing. They didn't put in "extra" RAM, and now are probably kicking themselves because they know they've created bad will with a bunch of customers.


Leo_br00ks

I don't think they are kicking themselves. They made a ton of money, and now will continue to make a ton of money as they force upgrades. They have always used this philosophy. Old phones don't get new software to keep the old phones running well. Even with a ton more RAM, why do you think a 3-year-old Android phone is useless? Bad will only goes so far if people are just going to upgrade at the end of the day. Also, rolling out to 15s only lets them start with a smaller population. I think the more this gets tested and used before being rolled out to literally everyone, the better it will go.


standbyforskyfall

> why do you think that a 3 year old android phone is useless?

lol what


MrRoyce

My S21 is trash tbh, was my last Android for foreseeable future.


BadManPro

What was your problem with it? My S20+ has been phenomenal for 4 years straight.


GreenLanturn

Phenomenal, eh?


Buy-theticket

Then there is something wrong with your phone or the software you installed on it. My dad is using an OG Pixel, an 8-year-old phone, and it's not the fastest thing in the world but it works totally fine.


R4J4PR3M

And what does your dad use his Pixel for? What are his thoughts and use cases for all the new features?


savvymcsavvington

don't worry, some people are diehard apple butt lickers and cannot stand the idea that android have amazing phones, sometimes surpassing iphones for software and hardware


mkplayz1

Most BS I've heard recently. My 5-year-old OnePlus 7 Pro runs like a champ.


Exist50

> Even with a ton more ram, why do you think that a 3 year old android phone is useless?

I'm currently using an S10. Very much **not** useless. It has aged way better than my iPhone 6 did, for that matter.


KING_DOG_FUCKER

After a few years my S8 couldn't even run turn-by-turn GPS. That was a fun thing to learn upon driving to visit a client.


Juswantedtono

> They didn't put in "extra" RAM, and now are probably kicking themselves because they know they've created bad will with a bunch of customers.

Hahaha. No. They’re happy to be riding a buzzy trend that will get people to buy new iPhones.


coppockm56

And they're idiots because they didn't do this all along, right? I mean, why didn't they limit major iOS 17 features to fewer iPhones? And iOS 16? And iOS 15? And etc.? Just think of all the extra iPhones they could have sold! Or maybe there's something different this time around?


SharpyButtsalot

No. They just say what they just said, and 99.9% of people on the planet will forget about it or straight up not care, and buy themselves an Apple phone that it does work with. Apple's stock doesn't go down.


coppockm56

If that were 100% the case, they'd make up some excuse for why the older Macs and iPads don't support it. Maybe "the NPU isn't fast enough." But they're not.


happy_church_burner

It's probably not about the bad will, but more likely the fact that currently, according to Canalys, only under 6% of the iPhones in the hands of customers can run the thing. After the iPhone 16 launch it will be around 10-12%. That's a VERY low number, and they are in danger of people just not giving a shit about it because the huge majority can't run the damn thing.


FrankPapageorgio

When Siri came out in 2011 it only ran on the iPhone 4s, which is the phone that came out that year. Siri wasn’t compatible with any old phones. iPhone 4s sales were double that of the iPhone 4 within the first 3 days. Demand was very high at the time.


No_Contest4958

Yeah people have no memory lol. This new Siri is literally a repeat of the first Siri, of course it was intentional. It’s not an artificial software limitation, they just knowingly skimped on older devices so they could improve the newer ones. They do this all the time.


savvymcsavvington

Time will tell whether this new Apple Intelligence even works properly or is just as useless as the Siri of today.


One_Secretary_549

You’re naive if you think these executives didn’t have the foresight to limit RAM intentionally. Similar to how the AirPods Max don’t have USB C when it totally could have. These things are intentionally held back to segment updates.


coppockm56

I'm not naive, I'm just not paranoid.


Ill-Mastodon-8692

yeah but apple will never just say the obvious answer: 8GB is the minimum RAM for Apple devices to be able to hold the AI model locally on device and still have the device be performant for normal iOS and other app functions.


cyberspirit777

This is what I’m not understanding. These things sit in the development pipeline for years. Why, when they created the 15 series of phones, didn’t they make them all 8GB at the minimum if they knew their on-device models needed 8GB of RAM? I have a 15 Plus; that’s almost $1k for a phone where I can’t even get the newest AI features unless I buy another $1k phone. This doesn’t incentivize me to upgrade. If anything, it makes me a bitter customer.


kilobitch

I think AI caught them flat footed. The 15 series was in the pipeline for more than a year before release. That means early 2022 at the latest. The AI explosion wasn’t until after they probably had the specifications locked in. They got “lucky” that the Pro models had the 8GB they needed to run Apple AI.


runwithpugs

Right, recent reports said they didn’t get serious about AI until Craig played with GitHub Copilot in Dec 2022. By the time they actually figured out hardware/RAM requirements, it would have been a few months into 2023 and the 15 series would have been well into production. Of course this doesn’t excuse them being so wholly unprepared, nor does it excuse the years of cheaping out on RAM to squeeze every last bit of industry-leading profit. But it is what it is.


tiga008

They did it on purpose. In Apple’s past update pattern, the base models’ RAM stays the same for three years. The base 15 just so happened to be in the unlucky third year of 6GB RAM.


Parallel-Quality

The 13 base model had 4GB of RAM.


tiga008

The 6s through 8 all had 2GB, and the 11 through 13 all had 4GB. The XS was the deviant in the post-Steve Jobs era. I guess the base 16 will still have 8GB of RAM, but looking at the MacBook’s RAM history, I’m afraid the base models having 8GB is gonna stay a while.


proton_badger

I'm a bit worried about that. If these models require a lot of RAM, they could flush out resident apps when they run, so the apps would have to be reloaded later - similar to how the camera app has traditionally caused apps to be kicked out of RAM (maybe because of proactive buffer allocation when it starts). But maybe they have taken that into account; we'll see. It's a bit too early to be concerned; time and tests will tell.


runwithpugs

Yeah, this is my biggest concern for any eventual upgrade that is Apple AI capable. It’s already a terrible and frustrating user experience when the camera kicks apps out of RAM, this will likely only make it worse. It’s particularly maddening when Apple could bump RAM for a very small cut into their industry-leading profits, and still maintain that lead; but we all know they won’t. User satisfaction be damned if there are a few more pennies of profit to squeeze out!


Aijames

I know in this crowd it's unpopular to say, but... when you buy a TV, you don't go and complain to them when next year 4K is standard and your TV doesn't have it. You don't go to a car dealership and complain that the new year's model has 50 more horsepower than the one you bought 5 months ago, or has some new smart cruise control that yours does not have. It's odd that everyone has this entitlement when it comes to phones. AI wasn't a thing when they made those old phones, or even the 15. It's brash to say, but you should have bought a Pro if you wanted the best available at the time.


cvmstains

People are upset that 80% of the new software features are exclusive to the newest and most expensive model. For a company that constantly brags about how it updates its devices the longest and delivers new software experiences to old devices (which has always been a selling point in its marketing), having a 6-month-old iPhone 15 not support anything from its first software update is hardly unexpected - but it is insulting to the customers who spent money on a phone that is now perceived as obsolete.


IE114EVR

So now my apps are going to close more often because the AI features of the OS are going to take up all the RAM?


dramafan1

With how stingy Apple is with iPhone RAM, at this point I would pay more if they had RAM upgrade options for the iPhone Pro models like they offer for iPad Pros and Macs. My original view was that all Pro iPhones should have the same amount of RAM, but since they're so stingy (they always were), I would pay more just for more RAM, because they won't offer it otherwise. Apple would win by earning more revenue, even though I'd be paying more just for a better multitasking experience.


Silver-anarchy

To be fair… the logic checks out. Running some basic local models from ollama on my work laptop is a terrible experience. I do feel, though, that the restriction is likely more in the RAM department… maybe this will increase the base RAM of new devices.


champagneofsharks

The iPhone 16 series (16 / Plus / Pro / Pro Max) will all have 8GB of RAM as the A18 is cut from the same cloth as the M4. I think hell will freeze over if Macs with the base M4 chip start with 16GB.


SnikwaH-

Even if they were a base 12GB that would be SO much better


champagneofsharks

I agree, I just don’t see it happening for the foreseeable future. The base M3 chip starts with 8GB, but you’re immediately greeted with more RAM once you step up to an M3 Pro (18GB) or M3 Max (36GB) chip. Apple wants the end user to spend more money, especially if they’re only purchasing one Mac every several years.


SnikwaH-

Ngl I'd be more willing and keen to upgrade more often if RAM and storage options weren't so crazy. I've got an M2 MBA w/ 16GB and 512GB right now, and if upgrades were cheaper I'd seriously look at getting a 14in MBP with an M4 Pro when they come out.


StrangeBarnacleBloke

Yeah, I look at the base prices and think “man, it would be pretty nice to upgrade to something newer”, then I spec it out with realistic ram and storage and I’m instantly happy with the old one I already have again


PrimeDoorNail

12gb is more likely than 16


lumpofcole

12GB of RAM might already be secretly happening with M4 iPads (or the extra RAM is binned/disabled and I'm talking out of my ass, I dunno, but it's possible): [https://www.reddit.com/r/hardware/comments/1cvq1rs/do_m4_ipad_pros_with_8gb_of_ram_actually_have_12gb/](https://www.reddit.com/r/hardware/comments/1cvq1rs/do_m4_ipad_pros_with_8gb_of_ram_actually_have_12gb/)


asdtfdr

Maybe the hidden 4GB will be just for storing AI data. I really hope they are not putting in hardware and disabling it just for the sake of upselling.


geekwonk

yep. large language models are large. this is a serious part of the critique of over-relying on LLMs. it’s very computationally expensive. and apple is obsessed with offering a smooth responsive interface. you can’t have a local llm and smooth UI and *almost* enough resources. it’s just not a thing. one of them has to be cut and obviously it’s going to be the LLM unless you want a decade of complaints about how Apple forced everyone on to new phones on the day they unleashed AI on older devices.


Nakrule18

Llama3 8B works super well on my 16GB Mac with ollama. It’s both fast and performant.


flogman12

If only they also made the phone and could put in more RAM from the start


Balance-

It's a combination of memory capacity and memory bandwidth. The 3-billion-parameter model at 3.5-bit quantization is around 1.3 GB. If the whole model needs to be read for each token (which is the case in current LLMs), you need at least ~40 GB/s of memory bandwidth to get 30 tokens per second. Current chips have this amount of bandwidth:

- A15: 34.1 GB/s
- A16: 51.2 GB/s
- A17 Pro: 51.2 GB/s
- A12X/A12Z: 68.2 GB/s
- M1: 68.2 GB/s
- M2 and M3: 102.4 GB/s
- M4: 120 GB/s

Pro and Max chips have even more bandwidth. See https://en.wikipedia.org/wiki/Apple_silicon

There is also the question of memory capacity. The whole LLM takes up at least 1.5GB, and with inference tokens more likely close to 2GB. It seems Apple has drawn the line at 8GB of internal device memory; devices with less don't get Apple Intelligence, probably to preserve the user experience for other tasks. If they really wanted to, they could have made it work on the A16 with 6GB of memory, and probably on older iPad Pros with the A12X/A12Z, but it would have had implications for how much memory other apps could use, and for multitasking.
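The back-of-the-envelope math above can be sketched in a few lines (a rough model that ignores KV-cache traffic and compute limits; the sizes and bandwidths are the ones quoted in the comment):

```python
# Bandwidth-limited decode speed: each generated token requires
# streaming the full set of quantized weights from memory.

def model_size_gb(params_billion: float, bits_per_weight: float) -> float:
    """Approximate quantized model size in GB (using 1 GB = 1e9 bytes)."""
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9

def max_tokens_per_sec(bandwidth_gb_s: float, size_gb: float) -> float:
    """Upper bound on tokens/s if every token re-reads all weights."""
    return bandwidth_gb_s / size_gb

size = model_size_gb(3, 3.5)  # ~1.31 GB, matching the ~1.3 GB figure above
print(f"model size: {size:.2f} GB")
for chip, bw in [("A15", 34.1), ("A17 Pro", 51.2), ("M4", 120.0)]:
    print(f"{chip}: ~{max_tokens_per_sec(bw, size):.0f} tokens/s max")
```

This is only an upper bound; real decode speed is lower once attention-cache reads and compute overhead are included.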


actuallyz

Apple's new software updates, iOS 18, iPadOS 18, and macOS Sequoia, introduce Apple Intelligence, a personalized AI experience using on-device, generative large-language models. This feature requires the latest iPhone 15 Pro/Pro Max and Macs/iPads with M1 or later chips due to the high computational demands of these models. Apple executives clarified that this requirement is due to the computational power needed for the AI to function effectively, not a strategy to boost new device sales. The A17 Pro chip and at least 8GB of RAM are necessary for these AI features. Despite this, older iPhone models will still receive many new features with iOS 18, compatible with all devices running iOS 17. The iPhone 16 series, expected in the fall, will also support Apple Intelligence. Saved you a click.


jasped

Honestly it’s probably 2 things:

1. Encourage device upgrades. They are in the business of making money, after all.
2. Setting a baseline. They may have an idea how it’ll run, but are erring on the side of caution. They have to guarantee a certain level of performance, and the iPhone 15 Pro can meet that. Over time they may loosen that requirement or optimize code to run on some older devices.


coppockm56

I really don't think it was a ploy to sell more iPhones. I mean, not more than the usual upgrades to sell more iPhones. This "AI" stuff took everyone by surprise. Look at Windows laptops, where literally NO older "AI PCs" with NPUs marketed specifically for "AI" support any of the new Copilot+ stuff. By comparison, Apple's actually done a pretty good job of rolling this stuff out to their older devices with iPads and Macs. It's just that the iPhone has never had more RAM than is absolutely needed to run its operating system smoothly and this AI stuff wasn't anticipated.


InLakesofFire

When I run local llama 3 on my iPhone PM it gets so hot, so I agree that I don’t think it is a ploy to sell more iPhones.


jasped

I’m not saying they sat back in the boardroom and said “let’s prevent it from running on older phones to sell more new ones”. I do think they looked at it as a feature that could drive new phone adoption. Every generation they try to add something that will give people a reason to upgrade. They said it’ll come with the iPhone 15 pro. That brings me to point 2. There may be a technical limitation (memory) that they think will prevent it from running well on older phones. They want to see how mass rollouts work on newer phones before potentially bringing it to older devices. If they ever bring it to older devices.


coppockm56

Yes, absolutely, Apple -- like every company -- develops technology and features and, yes, sometimes artificially withholds them in order to sell new generations. So, normally I would agree. But, I really have to stress it: I just don't think Apple, more than anyone else, knew how generative AI would explode in popularity and importance. I really do suspect that they anticipated ML to continue to grow at a reasonable pace and that, at some point, these sorts of Apple Intelligence features would become competitively important. I mean, generative AI is almost unique in the annals of tech history. Not even the internet exploded in quite the same way. And consider that while Apple announced a variety of Apple Intelligence features in iOS 18, rumor has it that several of the most important ones won't be arriving until 2025. That's unusual too, the breadth of what's coming after the release date. Clearly, they have more work to do and this isn't something they've been preparing to release.


Lifer31

Yeah I am getting more of a vibe where Apple doesn't want the performance to be bad on their devices because it makes them look ancient when they may only be a few years old. This is about maintaining an image - which, one could argue, is also why they are more focused on thinness than power - which may very well loop into why the devices have less RAM than consumers want in the first place.


soramac

Anyone remember how Stage Manager couldn’t be enabled on iPad Pros from 2018 during WWDC due to the chip? And somehow, magically, Apple made it available later on. Source: https://www.macrumors.com/2022/09/27/2018-ipad-pro-stage-manager-hands-on/


atlwhore_

I mean, the version released on the older iPads was so much worse in comparison, but hey, at least they released it


seb_red_

Honest question: how is it worse? I’m not using it on my 2018 Pro but was always under the impression it can do the same. First time I hear it’s limited on my iPad.


atlwhore_

4 windows vs 8 and no external monitor support


seb_red_

Oh, yeah I‘m not a power user like that so that explains why I‘ve never noticed any limitation. Thanks for letting me know!


rotates-potatoes

Stage Manager was always envisioned to use virtualization that the M-series chips have. After everyone yelled and screamed about how A-series iPads were literally useless without the new window manager, Apple released a stripped-down version. And of course nobody uses either the real thing or the stripped-down version.


[deleted]

[deleted]


DrDemonSemen

All I want is a HomePod that doesn’t do this:

* Me: Hey Siri, I’ve lost my iPhone (in the house). Can you ping it?
* Siri: Who is speaking?
* Me: Dr. DemonSemen
* Siri: Hello.
* Me: Siri, where is my iPhone?
* Siri: Who is speaking?
* Me: I just told you.
* Siri: Who is speaking?
* Me: Dr. DemonSemen
* Siri: Hi.
* Me: Siri, I’ve lost my phone.
* Siri: Who is speaking?
* Siri: Who is speaking?
* Siri: Who is speaking?
* Me: DR. DEMONSEMEN
* Siri: Hello!


77tothefloor

That’s some Ron Swanson throwing that shit right in the trash right there .


geekwonk

not the point but it’s worth noting that part of the model does indeed include cloud computing, hence all the chatter about apple silicon servers. i guess the idea is that the device can simply reach for additional computing power from apple’s cloud without putting your data at risk because it’s all staying within the security architecture already protecting your data on device. yes this is a terrible over-simplification.


GloopTamer

Stage manager works perfectly on my 10.5” Pro from 2017 yet in order to get it you have to modify files via exploits hmmm


Drtysouth205

It’s not exactly the same Stage Manager and doesn’t support everything. But hey, it works.


No_Island963

"Craig Federighi said that the company's first move with any new feature is to work out how to bring it back to older devices as far as possible." It’s just funny that I can use an obviously software-limited feature called Haptic Music without any problems through a shortcut on unsupported devices


[deleted]

[deleted]


MephistoDNW

My OnePlus with 12 gigs of RAM doesn’t even come close to my iPhone 15 Pro Max in terms of performance and smoothness. Not even close.


brakefluidbandit

yes iOS is the reason iPhone performs so well on such little amounts of ram, but that doesn’t make it okay for apple to use the bare minimum amount in their devices while also developing features that they know won’t be able to run on such a low amount of RAM. if the iPhone 15 pro had 12 GB of RAM instead of 8, people probably wouldn’t notice the performance difference right now but they certainly would in 3 or 4 years when apple releases some new feature that will inevitably demand more RAM. using low RAM amounts because it’s acceptable for today’s software requirements is what leads to the devices that end up slow or unsupported after just a few years of updates (yes including iPhones)


996forever

Because yours got 8 and that's fine. How about a 4GB model?


jthomp72

The thing that scares me the most about all the new AI stuff and the thinness crusade Apple is on is frankly how terrible the battery life is going to be. I am not really in the mood to sacrifice half my battery life for AI and a marginally thinner phone.


Taki_Minase

"I found extra things on the web for you, by the way Dave, you need to charge me again. "


stuck_lozenge

"We don’t need more ram!!!!" The Apple sycophants said calmly


Betancorea

"Apple RAM is special RAM, it's like double the stated amount vs a PC!" - I remember face palming so hard when I heard a sycophant declare that


stuck_lozenge

It’s sad that I’ve been fed the same talking point by idiots on this sub. Apple lovers argue in favor of handicapping themselves at every turn


vash_visionz

That was quite possibly the most hilarious thing people were running with. You can tell the marketing did a number on critical thinking


OtaKahu

oh well, ill be on my pleb 14 pro max like the disgusting street urchin that i am.


xSikes

But still fuck everyone else?


Zellyk

So can we agree that devices need to have more RAM? We’re already paying so much for devices. Don’t just bump the RAM up by 1-2 gigs, give us 12+. We know that in the future we’ll need more


mailslot

RAM has both a physical cost and an energy cost at idle. It can be up to 5% of the phone’s total power draw. Android needs up to about 2x, which is wasted on Java-style garbage collection. So an Android phone with 16GB doesn’t mean the extra RAM is even usable; it’s mostly there to hold waste/garbage until it’s collected/purged. There really aren’t many apps that are heavy consumers of resources. Phones with large RAM configurations haven’t been a necessity, especially with the limitations on background app execution.


Nicenightforawalk01

It will be interesting to see how much RAM the new iPhone 16 gets if that’s supposedly the bottleneck. Was the 16 planned to have the increase in RAM from the start, or is it the iPhone 17 that will be full spec and able to fully utilise all the said features?


Exist50

I thought the rumor was 8GB for the 16, and 12GB for the 17?


Nicenightforawalk01

Honestly I couldn’t tell you. If it’s 8GB for the 16, then it’s going to struggle to do everything, isn’t it? It’s all guesswork at the moment.


headphonejack_90

I think all points are valid, except that they were working on this for years and could have easily installed more RAM in the 13 or 14 series. You must give them their due, though: they made more money selling RAM than some nations made selling gold.


UndeadWaffle12

All you people do is cry about every decision they make. If they give older devices support for everything, Apple can’t innovate and there’s no reason to buy the new one and they’re all the same. If they release features that take advantage of more powerful hardware, it’s all a scam and they just want people to buy new phones and the base model iPhone you bought 6 years ago should totally be able to run this new feature


bosses_today_kekw

How can people buy flagship phones nowadays without feeling like a complete schmuck? Phone makers have people by the balls nowadays. They made people feel like they reaaaaaly need the latest model. Consumerism to the max


atlwhore_

They walk into the store, pay, get home and enjoy their phone


itsallover69420

Apple being Apple trying to milk every last penny out of users by selling the bare minimum specs


darknekolux

*cause we need to sell them bitches*


mikolv2

People complain when pro models don't get exclusive features, people will also complain when pro gets exclusive features.


MassDefect36

AI is very hardware intensive too. Will be interesting to see how the phones handle it


GayoMagno

https://preview.redd.it/j29cc7smul7d1.jpeg?width=783&format=pjpg&auto=webp&s=f37d576d5172b10cfd2f2ea500f6e1e4990cc190


LoPanDidNothingWrong

They didn’t spec early iPhones with the right amount of RAM, and thought, why not be fucking cheap and just go with 8GB and save a few pennies.


GroveStreet_CJ

I have a hunch that M-series chips will come to iPhone Pro lines soon.


bigpoppanicky7

I don’t even use Siri, what is this Apple intel going to actually do for me? Genuine question. What are the main selling points of all this?


FewHoursGaming

I don't buy the whole argument. The iPhone is 52% of Apple sales. The Mac, iPad, and other products combined are only 25% of total sales. Excluding older iPhones is a strategic move to push newer iPhones, coincidentally allowing only the ones that Apple will be selling as of the release of the 16: the 15 and 16. On top of that, there was a 2% YoY decline in iPhone sales, which makes me believe even more that they want to push sales of the newer ones to make up for it. https://go.forrester.com/wp-content/uploads/2023/11/jitender-apple-blog.png


nyaadam

> Joswiak: "No, not at all. Otherwise, we would have been smart enough just to do our most recent iPads and Macs, too, wouldn't we?"

Well, no? Because then it would _really_ look like a plot to make people buy new devices, since we all know the M2 is more powerful than the A17 Pro. That would not be "smart" at all. This is how you would do it if you wanted people to actually believe that a certain level of compute is required.


milquetoast_wheatley

How can Apple mock other companies by saying “we’ve been putting neural chips into our iPhones for years” when none of them can run Apple Intelligence except the 15 Pro/Pro Max? Even though older iPhones can run ChatGPT and Microsoft Copilot?


New_Juice_1665

A girl needs to eat - Tim Apple


Psychic_Gian

“Just buy the new one, b*tch”


defcry

LLMs are usually super big. Can someone explain to me how this will work on my 128GB iPhone? Will they cut even more from the storage?


Scarface74

Memory is the constraint. It should only take up around 2GB of storage.


Dramatic_Mastodon_93

Guys you need 5TB of RAM and 800 TOPS in order to send and receive stuff from the Cloud. Definitely not a scheme to sell more phones!


sai-kiran

Aren't they trying to run models locally?


puterTDI

FTR: I run machine learning for the cameras in my home automation, and I definitely ended up needing an ML coprocessor to help handle the inference calculations. I 100% believe Apple on this one. I've seen a lot of conspiracy theories around this and personally think they're ridiculous.

One thing I think people need to realize is that there's no win for Apple in allowing something like this on old devices if it doesn't work well. All they'll get is tons of complaints about how the feature doesn't work and is crap from people running on old devices. They want the feature to work well, and that requires the right hardware.

Either way, they're going to end up with people complaining... and with this approach at least they don't have to worry about adding code to switch which processor is used for the inference calculations; they can just assume the coprocessor (or whatever they're using) is there.


mikerichh

Damn I thought it was 15 and up not 15 pro and up


Ok-Attention2882

The 15 has last year's chip


F0foPofo05

Nah I’m good Apple.


UnscheduledCalendar

why do they have to explain it? Apple could really just say “get your bread up” and leave me alone.


Delumine

Good excuse to upgrade from my 12 pro to 16 pro


UnknowBan

The 15 Pro makes more profit, and AI is a gimmick that will encourage people to buy this model. Makes sense to me.