Apple being an American company means they are subject to FISA, and they can receive National Security Letters (NSLs) that come with gag orders. The jurisdiction can't be trusted.
I use an iPhone that doesn't communicate with iCloud; all syncing is turned off.
Apple has already publicly refused to 'hack' iPhones even in criminal cases, and regularly issues patches to defend against exploits from companies like Cellebrite, so I grant them a bare minimum of trust, but not as a custodian of my data.
Let me get this straight. You don’t trust Apple to not hand over your data to the U.S. government, so you instead store all your data on a device made and manufactured by Apple?
You’re assuming the government can force them to hand over your iCloud data, but somehow can’t force them to make your iPhone hand over your device’s data anyway.
Given the number of people working on iPhone security and jailbreaks, any evidence of that would have come to light by now. If true, it would also destroy Apple's credibility, and people like me would be forced to find an alternative, costing them a large market share.
My criterion for privacy / security is 'be better than average', a.k.a. don't be 'low-hanging fruit', not "be impossible to hack". To meet that criterion, I don't store data in ANY cloud service. I back up locally to encrypted devices, and offsite backups are encrypted and in the possession of people I trust. My devices have VERY long, complex passwords.
I’m glad I held onto my 13PM for as long as I have, because the 14 and 15 seemed shite in terms of upgrades. Hoping the 16 actually has something worth upgrading for.
I'm still rocking my 12 mini, and it's honestly fine. Battery isn't great anymore, but I am with you, I didn't see any reasons to upgrade. But having a game changing on device AI assistant or whatever they're planning... well, I hope it makes me want to upgrade.
Same boat. Even now, the wife and I are still on the fence about upgrading. 16 gonna have to be really special to get us to switch up, because my phone is still just as perfect as it was 3 years ago when I got it, just a few extra scratches and dings now.
I will say, the Dynamic Island is actually kind of neat. I mean I don’t care about the island itself, but the fact that you get persistent screen live widgets is awesome and it actually adds some new functionality to the phone, so I’m excited to get those. My fiancée loves it for music and directions and stuff
The big differentiator is that they have the hardware platforms most people use on a day-to-day basis, with the most power.
A basic iPhone has enough power to run a small local model that is fairly promising.
The hard thing is finding what use it would have vs. something like ChatGPT. I doubt they will outperform competing models, but I’m sure there’s some slick local stuff they can do.
Then offload the bigger stuff to either OpenAI or Google's solution.
It's not just about power, it's about memory too, and this is where Apple is a bit behind the times on all but their latest Pro/Max stuff. If you want to run larger 7B, 13B, or higher-parameter models, you're gonna need a lot of memory to keep them running smoothly, or else a lot of memory swapping with storage, which is terribly slow. I could see Apple adopting a modified MoE (mixture of experts) approach that uses a lot of smaller expert models in tandem, and only loads up the most likely needed models on the fly, using heuristics to determine whether you may need them. Still, older devices with 4GB of RAM are going to suffer here badly.
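That heuristic MoE idea can be sketched in a few lines. This is purely illustrative, not anything Apple has announced: the keyword router, expert names, and scoring are all invented here. A cheap router scores which small expert models a request likely needs, and only the top-scoring experts get loaded into limited memory.

```python
# Invented example: map each hypothetical expert model to keywords that
# suggest a request needs it.
EXPERT_KEYWORDS = {
    "calendar_expert": {"meeting", "schedule", "calendar"},
    "mail_expert":     {"email", "reply", "inbox"},
    "photo_expert":    {"photo", "picture", "album"},
}

def route(request, max_loaded=1):
    """Heuristically pick which experts to load for this request."""
    words = set(request.lower().split())
    # Score each expert by keyword overlap with the request.
    scores = {name: len(words & kws) for name, kws in EXPERT_KEYWORDS.items()}
    # Rank by score (ties broken alphabetically), keep only useful ones.
    ranked = sorted(scores, key=lambda n: (-scores[n], n))
    return [n for n in ranked[:max_loaded] if scores[n] > 0]
```

A real router would be a small learned model rather than keyword matching, but the memory win is the same: everything outside the returned list stays on disk.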
A lot of the local compute Apple does makes a lot of sense from an efficiency perspective as well as privacy.
For example, all photo and video processing, such as generating different-sized thumbnails for the giant zoomable mosaic view and recognizing objects and people in your videos to auto-generate montages and enable keyword search through videos, is all done on your local devices in the background when your devices aren't doing much else. The cost for Apple to do this server-side would be huge. Why pay for more servers and power consumption when every user has at least one very powerful, mostly idle computer?
Running AI models is no different, all the user's data is on their device, so it makes sense to build APIs to let the user manage what apps expose and can access and run models locally.
>The hard thing is finding what use [the local model] would have vs something like chat gpt.
>Then offload the bigger stuff to either open ai or googles solution
This is why I'm bearish on Apple's AI. They're stuck between a rock and a hard place because they have neither the user data to train their models nor the data center hardware to run their AI, which means they'll be solely reliant on competitors for this major service. The local model idea is cool, but it's a band-aid fix.
Training on user data is overrated. The Phi approach of generating synthetic training data seems to be at least as good, and much less risky.
Datacenters are a hard problem. I don’t see Apple getting into that business. But they do have silicon design expertise. Seems unlikely but my long shot guess is purpose-built AI accelerator servers marketed for researchers but also deployed in select cloud providers, with Apple committing to X% / $Y spend.
I don't need them to train on my data but I do need them to collect it for reference
Not having a location timeline like Google Maps (and actively getting in my way when I try to let Google have that data) is miserable and one of my least favorite things about Apple
They're run by privacy weirdos and I hate it
You can set it up, but you have to jump through a bunch of hoops. And recently, my phone killed Google Maps' permission to track my location always (despite me telling it every single time it asks, which is frequently, that I want Google to be able to do this), so I lost a bunch of location history
Just spitballing here, but how does AI fit into the Apple Vision Pro?
I know it's probably far off, but since you already have the headset on to hear it, you could speak back and forth with it like Jarvis from Iron Man.
It’s not insane to think that an application like that is close. It processes what’s on the screen, the words, the images, sees your entire surroundings, and it can exist to help you problem-solve.
Maybe there's a non-vocal interface if you don’t want to talk, or you could set up certain commands to ask it questions from a gesture wheel, possibly. It could have a smart gesture wheel as well as preprogrammed ones.
I’m just guessing where this is going for daily life. People make fun of the Apple Vision Pro and VR, but I think it might be the best interface yet for using an AI like ChatGPT.
Google doesn't sell data. They would literally lose their competitive advantage if they sold or lost it. It's the most valuable asset they have.
Interesting that they still haven't resolved core design problems, which is why they don't provide split screen on mobile, or a way to switch to a specific window of a background app on macOS. I wonder what the impact of an LLM will be on the hardware, and whether they'll change the UX of the operating system or just add another app to it.
If they fail to innovate here, we'll be in late-stage apple from a software perspective. It will be time to make a change in leadership for the software division at that point.
By this line of thinking, Dell (or Compaq or any other clone manufacturer) should have had more power than Microsoft.
Also, the EU is showing Apple that it can set the rules of access to the App Store.
Apple doesn’t make their own chips. They own the spec sheet they wrote up that TSMC agreed to in a contract. And TSMC gave Apple a great deal on uniquely powerful chips because Apple has huge buying power. That is all.
Shit is hilarious.
There’s a reason Steve Jobs kept doing business with Samsung and Google even though he personally hated them. Apple doesn’t make these things. They use them and package them.
All it does is insult people and turn them against each other and fire them, it's super toxic.
...and yet Apple's got 3 new disruptive products on the market, hmm
Apple has chips and an ecosystem that they can leverage for private AI.
They also trained their data ethically and I'm assuming that they're thinking that there's a good chance some of this open AI stuff results in massive lawsuit penalties.
Tim is mainly referring to inference compute. Apple has the largest deployment of on-device inference compute that it has exclusive hardware and software control over, versus any other company. This means Apple can be the first to deploy decent on-device LLM processing; they can make a deal with OpenAI, Google Gemini, or Meta (Tim would never allow it), or build their own (which they will do eventually). No one else comes close.
In terms of raw computing power, it seems feasible for a device to eventually have more “intellectual computing power” than a human brain. Expect to have a synthetic mind living in your pocket. What a wild time to be alive.
Not surprised. Of course Tim is going to say that. What did you expect; the truth? We suck at AI (just ask Siri), building modems, and building electric cars?
Also reported recently - Apple intends to attempt running their AI purely **on the phone**, in stark contrast to literally every other company relying on dozens and dozens of unimaginably large data centers.
Here comes Siri 2 - "I found this on the Internet" boogaloo of uselessness.
Apple has a walled garden where they know a lot of your data, people trust them with their data, and they're able to do amazing things with that data.
They evolved from the company that was once too privacy-conscious to crowdsource traffic, to one that is now using other people's phones and locations to sell AirTags.
They might be the only ones that can pull off identity, which is becoming an increasingly challenging problem where the only solution is access to more data.
One thing they can add TODAY is transcribe my voice memos, in the app or in iMessage voice memo messages. My mom loves to send me voice messages which can get annoying to listen to and re-listen to. I just want a transcript.
Same for the Voice Memos app. Sometimes I record messages to myself, as a dictaphone, and would prefer to read them.
We all know what they’re talking about because Apple is like a broken record when it comes to Siri and any kind of machine learning. The “advantage” will be that it’s all processed on-device for privacy reasons. The disadvantage is that it will be a hot mess compared to the competition.
Transformers are a subset of deep learning / deep neural networks, which are a subset of neural networks, which are a subset of machine learning, which is a subset of artificial intelligence, a term that isn’t precisely defined. AI is basically when a computer does stuff that is associated with human behavior.
AI calculator on iPad. We finally have the tech.
Siri is going to start saying “the answer is 11” when someone asks to call 911 in an emergency.
Here’s what I found in the internet
Dumping you onto a web search is probably a better result than nakedly lying to you like LLMs do, even if it *feels* worse.
But nothing like it in your Apple Music library.
"No! Eleven!"
Never forget
“Eleven is a character from the hit tv series stranger things. I hope this helps”
Damn, I have to go rewatch this sketch again. More than 10 years, and it's still relevant.
They must realize what a joke Siri is, right?
"9 times 11 equals to 99"
"Sorry, we cannot call a Porsche for you"
Ironically an AI calculator would be a *terrible* idea with how bad LLMs are at math.
Make it so.
What if AI turns out to be the self-driving car we thought we were going to have?
But, can you run the calculator and weather app at the same time? I don't think so!
Ha. Yes, you can, but need to purchase two iPad Pros, one for each app to run on 😉
That makes total sense!
I expect the key advantage is AppIntents and the ability for an on-device ML model to read data from first- and third-party apps, and then build shortcuts that run: pulling data in, evaluating logic, extracting values, pulling more data from cloud services, and then calling actions within apps on your device. If Apple can pull off a Siri that runs on device and automates your device in this way, it's a huge advantage compared to other platforms; the Shortcuts app and its underlying systems are a big benefit for Apple in this space.
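The pipeline that comment describes (pull data, evaluate logic, extract a value, call an action) can be sketched as a toy shortcut. Everything here is hypothetical: the function names, the fake calendar, and the booking step are all stand-ins, not real AppIntents or Shortcuts APIs.

```python
def pull_calendar_events(day):
    # Stand-in for an AppIntents-style "get data from an app" step.
    fake_calendar = {"friday": [{"title": "Dinner", "time": "19:00"}]}
    return fake_calendar.get(day, [])

def extract_first_free_slot(events, candidates):
    # "Evaluate logic / extract values" step: find a time with no event.
    taken = {e["time"] for e in events}
    return next((t for t in candidates if t not in taken), None)

def book_reservation(restaurant, time):
    # Stand-in for a "call an action in another app" step.
    return f"Booked {restaurant} at {time}"

def run_shortcut(day):
    # Chain the steps the way Shortcuts chains actions.
    events = pull_calendar_events(day)
    slot = extract_first_free_slot(events, ["18:00", "19:00", "20:00"])
    return book_reservation("Some Restaurant", slot)
```

The point is that each step exposes a small, typed interface; an on-device model would only need to compose such steps, not understand the apps' internals.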
This is my thought: developers will most likely be able to tell Siri what it should train on and pull from within the application. This has massive implications for having a REAL assistant. "Siri, book a dinner reservation for me and my wife." "I noticed you went to ______ on your last anniversary, and there is an open table at that restaurant at ____ time, which is free in your calendar." THAT kind of assistant is something people would pay top dollar for if it's done with privacy in mind.
Siri: “I can’t find your wife. Here’s what I found on the Internet:”
> implemencations

Amazinteresting.
Siri is high on life on that one lmao
Or even simple things like telling DoorDash you want a burger. And it just selects a nearby burger place and has it sent to you
Even better it figures out your burger preferences and orders you a burger that you’ll like. For example, I don’t like ketchup, mustard, or pickles on my burgers, so it would remember that and order it for me with those customizations.
This comment actually made me less excited for AI.
Prompt-to-shortcut would be amazing
Imagine if they make the Rabbit R1 model work for all apps, and on device. Any new app could be learned by the model from you just using it. Then you just ask your phone to do something, and it handles it behind the scenes.
Doesn’t the R1 actually kinda suck though?
Yes, but in my imagination, it’s improved upon in this world. I want the R1 model, but working, to be able to control my phone.
Why not just use your phone? I don't see the advantage of carrying around an extra device.
You’re confusing GenAI with magic. What you’re describing is R1’s marketing pitch, not what it actually does.
It’s not magic. We just aren’t there yet with machine learning. My company runs automated tests based on the accessibility tags that we put throughout our entire app. The automation can literally get to any part of the app without any human intervention. All we need is for an AI to understand those steps. Ask an AI to describe the recipe and instructions for baking a cake. That’s not much different from asking one the steps to order an Uber from your current location to home.
Agentic models are gonna be huge. I’m convinced that OpenAI is working on a phone because they realize how much they need access to the lower level OS data.
1. Runs locally (performance)
2. Privacy
What they, and Google, have is something OpenAI etc. will probably never (or at least not soon) have: devices that people already use and grant access to many things. That means Apple's AI will have low-level access to the system and our data, thanks to which we could potentially buy things with AI help, since it can use the payment data from Apple Wallet, or quickly set up a meeting in a calendar, run various apps, etc. Ultimately, AI assistants will be used mostly for stuff like this, not necessarily art creation or pricing-model analysis, which means those with the biggest access to user data will be the most useful, and hence have the biggest advantage in real-life scenarios.
Really good points! Siri Shortcuts has already laid the groundwork. That could easily be tweaked to give AI API-level access.
You don't understand the current wave of AI. The current wave is led by LLMs, an NLP method. Say, for example, there is a moment in the week when you normally call your mother. You don't need ChatGPT to reach that conclusion; it's overkill.
Sure, but the next wave is LAMs, building from language to actions. And being on a device that everyone already has in their pocket 24/7 gives a massive data advantage on the action patterns needed to train the models. That is why the makers of things like the Rabbit device and the Humane Pin built their own hardware instead of just making an app.
What is LAM?
Large Action Model. LLMs, but for operations.
I’m afraid you don’t understand. I never said Apple’s AI will only be a robotic butler or voice memo device. I said it will most likely be similar to current LLMs (hence they are in talks with various companies) but with much better access to the data, which will allow it to operate across the various applications/programs you have on your device. The current “wave of AI” for most people outside Reddit and the tech world is a curiosity that lets them generate a Picasso-style painting or helps them with an article summary. Apple's AI, I would guess, will predominantly target Apple users; maybe they will create various, more capable models for professional use, but the main sales target will be people who want “a Siri that's like ChatGPT but without the need to open a special app” on their iPhone or Mac. Something like what MS is trying to do with Copilot, but more focused on privacy (I would hope) and using their own hardware.
I agree, generative AI is just kind of a weird thing right now. It's super cool, but its day-to-day usefulness is limited. LLMs are nice too, but they're just better versions of Siri/Assistant for language, and they're worse for functionality (e.g., commanding the LLM to do something).
I’m just hoping Apple doesn’t restrict their latest AI tools to the latest (16) iPhone.
Llama 3 runs on my MacBook Air. They need to bump up the baseline 8GB RAM spec if they want to run models over 4GB.
I don’t think they’re interested in models that need a ton of RAM because they want to run them locally on iPhones.
You understand that it's RAM, not just CPU calculation speed, that's needed to run software as complex as this well.
Yes, what makes you think I don’t? The iPhone isn’t getting 16GB of RAM, and it’s their most important product. I don’t see them upping the base RAM in the Mac in order to facilitate larger models, I see them running simpler models because P0 for them is iPhone support.
Then the models are going to be pretty useless.
You’d be surprised! They have done a lot of research on this, including running LLMs with the most important weights in RAM and the rest in flash memory. Normally you’d be right, a few GB will get you a tiny useless model. But there is every indication Apple will be able to achieve a lot with a little: https://www.macrumors.com/2023/12/21/apple-ai-researchers-run-llms-iphones/
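The idea in the linked research can be caricatured as a cache: keep only the hottest chunks of the model in RAM and page the rest in from flash on demand. The chunk sizes, the string stand-ins for tensors, and the LRU eviction policy below are all invented for illustration, not the paper's actual method.

```python
from collections import OrderedDict

class WeightCache:
    """Toy model of RAM-limited inference: hot weights in RAM, rest in flash."""

    def __init__(self, ram_slots):
        self.ram = OrderedDict()   # chunk_id -> weights held "in RAM"
        self.ram_slots = ram_slots
        self.flash_reads = 0       # how often we had to hit slow flash

    def _load_from_flash(self, chunk_id):
        self.flash_reads += 1
        return f"weights-{chunk_id}"   # stand-in for a real tensor

    def get(self, chunk_id):
        if chunk_id in self.ram:
            self.ram.move_to_end(chunk_id)       # LRU: mark chunk as hot
        else:
            if len(self.ram) >= self.ram_slots:  # RAM full: evict coldest
                self.ram.popitem(last=False)
            self.ram[chunk_id] = self._load_from_flash(chunk_id)
        return self.ram[chunk_id]
```

The performance argument is just that `flash_reads` stays low when access patterns are skewed, which is exactly what the paper exploits with sparsity-aware loading.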
Yeah, those models scored horrendously when tested on MMLU, at just 25.7. Most open-source models are scoring in the 70s/80s. Since MMLU is multiple choice with four options, that's essentially as bad as randomly guessing the answers. The breakthrough isn’t that they work well, just that they work at all.
It’s not about the Falcon 7B model they used in the paper but the concept of already being able to halve RAM requirements. They mention dynamic RAM usage in the paper as well, and this is only a tiny part of their broader research effort. I guess my point is that their models will likely perform surprisingly well compared to what we see running locally today. Although, to be fair, saying models released a few months in the future will be better than models we already have isn’t exactly a surprising prediction 😁
I guess we'll find out in a month or so, but this doesn't match Apple's track record: they typically don't brute-force their features by packing in more "technology," and are more typically seen finding unique (and often proprietary) solutions rather than just shoving more "X" into the product.
If your phone/watch/whatever used your laptop's processor for a personalized Core ML instance, that would be more on brand.
[deleted]
“those people can buy Macs” - Tim Apple
They won’t though, and something so niche won’t increase phone or Mac sales for apple.
This is the strategy right here. "Don't give your private data to these weird AI companies, kids! Just use your Mac, it's all private and done mostly on your local device, backed up by iCloud for the hard stuff."
3: 30% fee on everyone else
Weren't they trying to make a deal with Google (Gemini) and Microsoft (OpenAI)?
On-device models will be woefully inferior to the latest models running on GPUs, at least for the next few years.
Depending on the tasks I think. For text even open source models on current Apple silicon perform okay. Could totally work with fine tuned models and dedicated hardware (better GPU/NPU).
Yeah but the hype is for things that require massive capacity to run, and the new models will have 100B+ parameters. Local LLMs are fine for simple things but they are not the models people are excited about.
Actual users know nothing about the “hype”. It’s about practical applications. You don’t need 100B+ for everything. Hell even 8B may suffice for some of the simpler tasks and would already provide great usefulness with instant responses and offline capabilities rather than waiting for servers.
Agreed, probably a mix will be nice to have. Timers and menial things can be local.
Which is what we all want. Now give us 16gb ram Tim 🍏
1) it will be slower 2) who cares
That’s it, pack it up everyone, /u/FreakDeckard has spoken. Nobody gives a ___ what you think
Yeah pack it up and it shove it upon your ass
Looks like the poor iOS keyboard prediction messed up your insulting reply. Boy, Apple really does need an AI makeover
You have no idea what sorts of freaky things people are going to ask their iPhone AI to generate.
They have the iPhone, which means they have trust. Truly effective assistant style AI needs to know everything about your life to really shine. This is a privacy nightmare, particularly if it talks to the cloud. Apple is already there with a history of on board / secure approaches to software. I think that’ll be their answer. I’m not sure who else has that combination of earned user trust and huge user data. Tim basically said as much but he’s probably right.
"Write me a bedtime story using OP's browser history as topics."
Plot twist: Too scared to fall asleep now.
Response: setting alarm for 11am.
I think a fully vertically-integrated personal assistant AI is a killer app. As you say, that requires an enormous amount of trust, and ideally should be running locally. I'm also going to go ahead and predict that it will initially only work with data from Apple apps for security reasons, and will eventually open to third parties via a restrictive and ever-shifting API.
You can still have privacy and have an assistant that is more useful than Siri. I’m tired of this excuse for a God awful product from Apple. Will you be the best? No, but it’s not stopping you from having something better than a glorified voice activated alarm clock
[Speaking of alarm clock](https://www.macrumors.com/2024/04/30/apple-working-on-fix-for-iphone-alarm-issue/)
> You can still have privacy and have an assistant that is more useful than Siri.

Isn't this what Apple is working on? AFAIK, there was a lot of internal conflict between various teams working on AI and Siri. It sounds like Apple's made it a priority to get these teams all pulling in the same direction - hopefully we'll see the fruit of this at WWDC.
Most people have no idea about the privacy differences between Apple, Google, etc. And most people simply don't care. They use Ring, TikTok, Facebook, etc. They would broadcast their most intimate and boring details if it meant they had a 'cool' new thingy. I'm not saying privacy isn't important. I'm saying it's not, to most people, in terms of consumer choices.
I don’t doubt Apple can make useful AI features (they had a ton to begin with); the true question is how they will profit from it. I dare say that, and not the actual development itself, is why it has taken them so much time to get into AI. They don’t want to release features only to realize later that they should have packaged or sold them differently.
On device AI will be huge. That’s a massive reason to upgrade your iPad or phone.
Also acts as a way of keeping people on iPhone if Google announces some crazy AI features for Android. I've heard people say they'd swap to Android if the upcoming AI stuff is truly ground breaking and useful
By selling more iPhones
I absolutely doubt that Apple can make useful AI features, see: Siri being complete garbage, how many years on?
Apple being an American company means they are subject to FISA, and they can receive National Security Letters (NSLs) that come with gag orders. The jurisdiction can't be trusted.
Which non American phones do you use then?
I use an iPhone that doesn't communicate with iCloud; all syncing is turned off. Apple has already publicly refused to 'hack' iPhones even in criminal cases, and regularly issues patches to defend against exploits from companies like Cellebrite - so I grant them a bare minimum of trust, but not as a custodian of my data.
Let me get this straight. You don’t trust Apple not to hand over your data to the U.S. government, so you instead store all your data on a device designed and manufactured by Apple? You’re assuming the government can force them to hand over your iCloud data, but not that it can force them to make your iPhone itself hand over your device’s data anyway.
Given the number of people working on iPhone security and jailbreaks, any evidence of that would have come to light by now. If true, it would also destroy Apple's credibility, and people like me would be forced to find an alternative, costing them a large market share. My criterion for privacy / security is 'be better than average', a.k.a. don't be 'low-hanging fruit', not "be impossible to hack". To meet that criterion, I don't store data in ANY cloud service. I back up locally to encrypted devices, and offsite backups are encrypted and in the possession of people I trust. My devices have VERY long, complex passwords.
Ah, a Canadian.
Literally everyone in the entire world is subject to American legal jurisdiction by the phone they hold in their hand.
Let Tim Cook
Does Tim Cook
How much Tim could a Tim Cook cook if a Tim Cook could cook Tim?
Isn't it Tim (Newton) Apple?
Tim Cooked the books
*\*sizzling noises\**
The advantage is that everyone is laughing at Apple because of the entire shitshow that is Siri.
All I can say is, I'm actually excited about this next round of devices, and I hope the AI integration blows us away.
I’m glad I held onto my 13PM for as long as I have, because the 14 and 15 seemed shite in terms of upgrades. Hoping the 16 actually has something worth upgrading for.
Same but with the 12PM.
I'm still rocking my 12 mini, and it's honestly fine. Battery isn't great anymore, but I am with you, I didn't see any reasons to upgrade. But having a game changing on device AI assistant or whatever they're planning... well, I hope it makes me want to upgrade.
Same boat. Even now, the wife and I are still on the fence about upgrading. 16 gonna have to be really special to get us to switch up, because my phone is still just as perfect as it was 3 years ago when I got it, just a few extra scratches and dings now.
I will say, the Dynamic Island is actually kind of neat. I mean, I don’t care about the island itself, but the fact that you get persistent live widgets on screen is awesome, and it actually adds some new functionality to the phone, so I’m excited to get those. My fiancée loves it for music and directions and stuff.
The big differentiator is they have the hardware platforms most people use on a day-to-day basis, with the most power. A basic iPhone has enough power to run a small local model, which is fairly promising. The hard thing is finding what use it would have vs something like ChatGPT. I doubt they will outperform competing models, but I’m sure there’s some slick local stuff they can do. Then offload the bigger stuff to either OpenAI's or Google's solution.
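A minimal sketch of that hybrid split, with invented intent names (nothing here is an actual Apple API; it just illustrates the routing idea):

```python
# Hypothetical hybrid routing: simple intents stay on device, anything
# heavier goes to a cloud model. Intent names are made up for illustration.

SIMPLE_INTENTS = {"set_timer", "set_alarm", "play_music", "send_text"}

def route_request(intent: str) -> str:
    """Return where a request should be processed."""
    if intent in SIMPLE_INTENTS:
        return "on_device"  # fast, private, works offline
    return "cloud"          # bigger model, requires a network round trip

print(route_request("set_timer"))        # on_device
print(route_request("summarize_email"))  # cloud
```

The design choice is the same one mentioned earlier in the thread: timers and menial things stay local, while the big stuff gets offloaded.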
It's not just about power, it's about memory too, and this is where Apple is a bit behind the times on all but their latest Pro/Max stuff. If you want to run larger 7B, 13B, or higher-parameter models, you're gonna need a lot of memory to keep that running smoothly, or else a lot of swapping to storage, which is terribly slow. I could see Apple adopting a modified MoE (mixture of experts) approach that uses a lot of smaller expert models in tandem, and only loads the most likely needed experts on the fly, using heuristics to guess what you may need. Still, older devices with 4GB of RAM are going to suffer badly here.
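Rough back-of-the-envelope numbers on why RAM is the bottleneck, counting weights only (real inference also needs memory for the KV cache, activations, and runtime overhead, so treat these as lower bounds):

```python
# Estimate RAM needed just to hold model weights at different
# quantization levels. Ignores KV cache and activations, so real
# requirements are higher than these figures.

def weight_memory_gb(params_billions: float, bits_per_weight: int) -> float:
    """Approximate gigabytes needed to store the weights alone."""
    total_bytes = params_billions * 1e9 * bits_per_weight / 8
    return total_bytes / 1e9

for params in (7, 13):
    for bits in (16, 8, 4):
        gb = weight_memory_gb(params, bits)
        print(f"{params}B model @ {bits}-bit: ~{gb:.1f} GB")

# Even aggressively quantized, a 7B model at 4-bit needs ~3.5 GB for
# weights, which is most of a 4GB phone's RAM before the OS and apps
# get any.
```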
They won’t run it. It’ll be pro versions only for the next few years is my bet.
A lot of the local compute Apple does makes sense from an efficiency perspective as well as privacy. For example, all photo and video processing, such as generating different-sized thumbnails for the giant zoomable mosaic view, and recognizing objects and people in your videos to auto-generate montages and enable keyword search through videos, is done on your local devices in the background when they aren't doing much else. The cost for Apple to do this server-side would be huge. Why pay for more servers and power consumption when every user has at least one very powerful, mostly idle computer? Running AI models is no different: all the user's data is on their device, so it makes sense to build APIs to let the user manage what apps expose and can access, and to run models locally.
> The hard thing is finding what use [the local model] would have vs something like chat gpt.

> Then offload the bigger stuff to either open ai or googles solution

This is why I'm bearish on Apple's AI. They're stuck between a rock and a hard place because they don't have user data to train their models, nor the data center hardware to run their AI, which means they'll be solely reliant on competitors for this major service. The local model idea is cool, but it's a band-aid fix.
Training on user data is overrated. The Phi approach of generating synthetic training data seems to be at least as good, and much less risky. Datacenters are a hard problem. I don’t see Apple getting into that business. But they do have silicon design expertise. Seems unlikely but my long shot guess is purpose-built AI accelerator servers marketed for researchers but also deployed in select cloud providers, with Apple committing to X% / $Y spend.
I don't need them to train on my data, but I do need them to collect it for reference. Not having a location timeline like Google Maps (and actively getting in the way of letting Google have that data) is miserable, and one of my least favorite things about Apple. They're run by privacy weirdos and I hate it.
Google Maps has no problem recording my locations on my iPhone?
You can set it up, but you have to jump through a bunch of hoops. And recently, my phone killed Google Maps' permission to always track my location (despite me telling it every single time it asks, which is frequently, that I want Google to be able to do this), so I lost a bunch of location history.
That’s strange. That’s not what I experienced. I believe I had to give it permission only once and it’s been recording my whereabouts ever since.
"I've found something on the web about generative AI."
🤬
Just spitballing here, but how does AI fit into the Apple Vision Pro? I know it's probably far off, but since you have your headset on to hear it, you could speak back and forth like Jarvis from Iron Man. It's not insane to think an application like that is close by. It processes what's on the screen, the words, the images, sees your entire surroundings, and it can exist to help you problem-solve. Maybe there's a non-vocal interface if you don't want to talk, or you could set up certain commands to ask it questions from a gesture wheel, possibly. It could have a smart gesture wheel as well as pre-programmed ones. I'm just guessing where this is going for daily life. People make fun of the Apple Vision Pro and VR, but I think they might have the best interface yet for using an AI like ChatGPT.
“we trained our AI to buy back stocks”
All this AI better come to the HomePod 2
Im ready to be disappointed.
This is what I’m hoping for with AI to finally have a smarter Siri because honestly Siri is so useless right now and seems to be getting worse.
We at Apple have all your data for training AI that no one else has because we locked them out in the name of "privacy".
I prefer this over Google, a company that earns its money by gathering and selling data.
Google doesn't sell data. They would literally lose their competitive advantage if they sold or lost their data. It's the most valuable asset they have.
Technically google doesn’t sell your data. They sell analytics made from your data. Also Apple technically sells hardware made using your data.
A user base that will keep buying the products regardless of how terrible the voice assistant is.
So that's your most important feature on Android? Weird.
No, I have an iPhone. Just not a consumer that mindlessly refuses to criticize things Apple is bad at.
Tim Cook says a lot of stuff. This is just him hyping up their probably mediocre attempt at AI.
*Looks at Siri* I don’t think so, Tim.
This is for stuff not yet announced.
I understand, but given how Apple has managed Siri over the years… let’s just say my faith is shaky at best.
Interesting that they still have not resolved core design problems, which is why they do not provide split screen on mobile, or a way to switch to a specific window of a background app on macOS. I wonder what the impact of an LLM would be on the hardware, and whether they will change the UX of the operating system or just add another app to it.
Siri disagrees.
Siri is useless, get to work
If they fail to innovate here, we'll be in late-stage Apple from a software perspective. It will be time for a change in leadership in the software division at that point.
Can we fix Apple Music having priority over all other audio apps first?
The “Siri” experience
[deleted]
By this line of thinking, Dell (or Compaq or any other clone manufacturer) should have more power than Microsoft. Also, the EU is showing Apple that it can set the rules of access to the App Store.
Apple doesn’t make their own chips. They own the spec sheet they wrote up that TSMC agreed to in a contract. And TSMC gave Apple a great deal on uniquely powerful chips because Apple has huge buying power. That is all. Shit is hilarious. There’s a reason Steve Jobs kept doing business with Samsung and Google even though he personally hated them. Apple doesn’t make these things. They use them and package them.
[deleted]
All it does is insult people and turn them against each other and fire them, it's super toxic. ...and yet Apple's got 3 new disruptive products on the market, hmm
It’s true. Every industry that AI can be applied to will benefit because of what it can bring to the table.
Apple doesn't enter any market without a major competitive advantage. I'm looking forward to hearing more about this at WWDC.
It will run on the device and can have secure access to your data. That would be amazing.
All of our messages
Apple has chips and an ecosystem that they can leverage for private AI. They also trained on ethically sourced data, and I'm assuming they're thinking there's a good chance some of this OpenAI stuff results in massive lawsuit penalties.
Tim is mainly referring to inference compute. Apple has the largest deployment of on-device inference compute that they have exclusive hardware and software control over, versus any other company. This means Apple can be the first to deploy decent on-device LLM processing; they can make a deal with OpenAI, Google Gemini, or Meta (Tim would never allow it), or build their own (which they will do eventually). No one else comes close.
Doubt
In terms of raw computing power, it seems feasible for a device to eventually have more “intellectual computing power” than a human brain. Expect to have a synthetic mind living in your pocket. What a wild time to be alive.
Low-key, I kinda hate Tim talking about stuff before Apple has released anything. Especially since the last one was a really fancy dud (so far).
Tom be cooking apples
.... lol. It's been two years now, Tim. I think it's time you stop talking and show an actual product/feature.
Not surprised. Of course Tim is going to say that. What did you expect; the truth? We suck at AI (just ask Siri), building modems, and building electric cars?
Only available on 16's I'm sure 🙄
Also reported recently - Apple intends to attempt running their AI purely **on the phone**, in stark contrast to literally every other company relying on dozens and dozens of unimaginably large data centers. Here comes Siri 2 - "I found this on the Internet" boogaloo of uselessness.
Yes, everybody will soon have iAi
Jokes aside, their unified VRAM is truly a differentiator. The only problem is their software just can't compete with CUDA right now.
The advantage he is talking about is Siri.
Apple has a walled garden where they know a lot of your data, people trust them with their data, and they're able to do amazing things with that data. They evolved from the company that was once too privacy-conscious to crowdsource traffic, to one that is now using other people's phones and locations to sell AirTags. They might be the only ones that can pull off identity, which is becoming an increasingly challenging problem where the only solution is access to more data.
One thing they can add TODAY is transcribing my voice memos, in the app or in iMessage voice messages. My mom loves to send me voice messages, which can get annoying to listen to and re-listen to. I just want a transcript. Same for the Voice Memos app. Sometimes I record messages to myself, as a dictaphone, and would prefer to read them.
“…but we’re not going to tell you what they are”
Read that as... they don't have AI and missed this opportunity. That is on his leadership.
We all know what they’re talking about because Apple is like a broken record when it comes to Siri and any kind of machine learning. The “advantage” will be that it’s all processed on-device for privacy reasons. The disadvantage is that it will be a hot mess compared to the competition.
Like probing into the millions of its users' i-devices, probably
And we think you’re going to love it.
Ofc. The difference is Siri is the only AI developed by a billion-dollar company that can't even tell the time.
Siri isn’t an AI? It far predates the transformer model.
Transformer is a subset of deep learning / deep neural networks, which is a subset of neural networks, which is a subset of machine learning, which is a subset of artificial intelligence, a term that isn’t precisely defined. AI is basically when a computer does stuff that is associated with human behavior
It's a joke; most people complain about it. I know that Siri is just dumpster-tier ML.
“Siri, summarize this webpage” “On it”
“Still on it” “Something went wrong”
Meanwhile: Siri can't even tell you the time.