Visual Intelligence can now be mapped to the Action Button on the iPhone 15 Pro and 15 Pro Max
By Stephen, 8 April, 2025
Forum: Other Apple Chat
So, I have the iPhone 15 Pro Max, and I just noticed that I can now use the Action Button for Visual Intelligence. I must admit, it is pretty awesome the way it connects to ChatGPT.
Comments
Explanation.
Hi Stephen:
Would you tell us more about how Visual Intelligence is working with ChatGPT? Thanks.
@ gailisaiah
All Apple Intelligence features work with ChatGPT.
With Visual Intelligence, you can get it to describe your surroundings, ask follow-up questions, read text, do a Google search, and so on. For example, if you're in a store, you can use Visual Intelligence to find that specific item in other stores and do price matching. I actually just noticed this feature on my 15 Pro Max this morning, so I only had about 15 minutes to play around with it before I had to go to work.
How to get to Visual Intelligence without Action Button
I have an iPhone 15 Pro and would like to use Visual Intelligence.
Unfortunately, I like to use the Action Button to quickly turn ring/silent mode on and off. I know some people have put together shortcuts for mapping two actions to the Action Button (one approach is sketched below), but that seems a bit clunky.
I thought that one should be able to long-press on the Camera app or long-press on the Camera control in Control Center to bring up Visual Intelligence as one of the options, but that seems not to be the case. Was this an oversight on Apple's part, or am I missing something?
--Pete
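For reference, here is one way to build such a two-in-one shortcut. This is a rough sketch only; the exact action names vary by iOS version, and some of them (a silent-mode toggle, a Visual Intelligence action) may not exist on yours:
1. In the Shortcuts app, create a new shortcut.
2. Add a "Choose from Menu" action with two menu options.
3. Under the first option, add an action that toggles ring/silent, or a Focus mode as a stand-in if your iOS version has no such action.
4. Under the second option, add an action that opens Visual Intelligence, or the Camera if no Visual Intelligence action is available.
5. In Settings > Action Button, swipe to Shortcut and select the new shortcut.
The trade-off is the one Pete mentions: every press of the Action Button now costs an extra tap to pick from the menu.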
@ peter
The Visual Intelligence button should just automatically be in your Control Center… I know it was for me.
I decided to map it to the Action Button because I keep my phone on vibrate anyway.
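If it doesn't show up automatically, you can add it by hand on iOS 18 (the exact wording may vary by version): open Control Center, touch and hold an empty area, or tap the + in the top-left corner, to enter editing mode, choose "Add a Control", and search for Visual Intelligence.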
Re: How to get to Visual Intelligence without Action Button
@ Stephen
Hey, thanks for that. I thought I had looked in the Control Center before and not seen it. But after you suggested looking there, I found Visual Intelligence in the Control Center after all.
Thanks a bunch for that. Seems like it could be a useful feature.
--Pete
Camera button
I thought you could already bring it up by holding the camera button long enough; it seems to work on my 16 Pro. I get the options to ask, capture a photo, or search.
@ Icosa
I'm talking about the iPhone 15 Pro and the iPhone 15 Pro Max. Those phones do not have the camera button. In the beginning, Visual Intelligence was only going to be for the 16 series, but they decided to give it to the 15 Pro series.
@ peter
They really snuck that in soooo quietly lol.
@ Stephen
Ah, my bad. For some reason I thought the camera button was an iPhone 15 thing, not 16. Probably because I'd been on a slightly older model for so long that I got accustomed to not having the latest bells and whistles from the most recent model.
@ Icosa
I know, it's a little confusing lol. The 15 Pro and the 15 Pro Max were introduced with the Action Button, which replaced the switch on the side. Then the 16 models came out with both the Action Button and this new camera button, which quite honestly sucks. But that aside, I can't see any reason why people would go from a 15 Pro Max to a 16. It's so pointless at this point.
Any way to customize prompts?
First, this is awesome. I didn't even know this was in the Control Center. Second, so far, it's very underwhelming. I'm wondering if there is a way to customize what it looks at? For instance, I took a picture of my office: a keyboard, braille display, monitor, mouse, mouse pad, computer tower, and water bottle. I tried twice. First, it told me the specs of the water bottle. Second, it told me about the braille display. It ignored everything else. Compare that with Be My AI, which gave me extensive descriptions, and I was a little disappointed. So far Apple Intelligence has been really underwhelming, and this is just one more reason why. I'd love to know what I'm doing wrong. Also, I am NOT using the search feature. Also also, there's no way to ask it follow-up questions: the button's there, but the keyboard doesn't open, and the app crashes. I am on the latest iOS.
Not Impressed
I tried Visual Intelligence for my forthcoming iPhone 16e review, and I was not at all impressed. Getting a description of the picture from ChatGPT required me to ask it a question first, and the search option gave me irrelevant results. Compare that with mapping the "Describe Fully" Be My Eyes shortcut to the Action Button, where I can press and hold the Action Button and have a picture taken and described all in one step.
@ Michael Hansen
I completely agree. The Be My Eyes option is so much better now that I've had a chance to play around with it a bit more.
Apple Intelligence
I've only tried it once so far, too. I think the issue is that Apple Intelligence is optimized for a different use case: it assumes the user is pointing at something specific they want more information about or want to take action on, not asking for a general description to start. For example, I showed it a packet of Idahoan Instant Mashed Potatoes. What I wanted was the name, to be sure it was potatoes and not rice (they feel similar). Apple Intelligence pulled out the phone number and a website, but forgot to tell me the name.
I think a previous poster is right: we need the ability to set some default prompts. Maybe in iOS 24. It'd be great to use an on-device model for at least the easier stuff, as it'd always be faster than sending to the cloud.
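For what it's worth, on-device handling of the easier stuff, such as reading text, is already possible through Apple's Vision framework. Here is a minimal Swift sketch of on-device text recognition; it assumes you already have a CGImage from the camera, and it illustrates the general idea rather than how Visual Intelligence is actually implemented:

import CoreGraphics
import Vision

// Minimal sketch: on-device text recognition with Apple's Vision framework.
// `cgImage` is assumed to be a CGImage you captured yourself.
func readText(in cgImage: CGImage) {
    let request = VNRecognizeTextRequest { request, _ in
        guard let observations = request.results as? [VNRecognizedTextObservation] else { return }
        // Keep the top candidate string from each detected text region.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        print(lines.joined(separator: "\n"))
    }
    request.recognitionLevel = .accurate // still runs entirely on device

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}

Everything in that snippet runs locally, which is why the speed argument above holds for simple tasks like reading a label.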