Has anyone done a review of the Point and Speak feature with LiDAR yet? If so, how well does it work with VoiceOver?
By Josh Kennedy, 19 July, 2023
Comments
I would love it if AppleVis would review this
Thomas could you please do a podcast on this? Thanks.
It's a beta, so...
I don't think it would be feasible at this time since the feature is in beta. Things can change, and Apple could pull it for whatever reason. I think it would be a good idea to wait until the RC is released so you get a more realistic feel for the feature.
they won't pull something that was announced for GAAD
If it's not ready
I didn't mean pull it as in never releasing it; I meant delaying the feature. Google Bard has the following to say on the topic.
Apple delayed a number of features during its last release cycle, including:
Matter support: Matter is a new smart home connectivity standard that will enable compatible accessories to work together seamlessly, across platforms. Matter support was originally scheduled to be released in iOS 16 and tvOS 16, but it was delayed until sometime later this year.
Focus filter: Focus filter is a new feature for iPadOS 16 that will allow users to draw boundaries within Apple apps like Calendar, Mail, Messages, and Safari for each Focus they enable. Focus filter was also delayed until sometime later this year.
iCloud Shared Photo Library: iCloud Shared Photo Library is a new feature that will allow users to share their photos and videos with other people in their iCloud Family Sharing group. iCloud Shared Photo Library was originally scheduled to be released in iOS 16, but it was delayed until later this year.
Live Activities: Live Activities is a new feature that will allow users to see live updates from apps like sports scores, stock tickers, and delivery tracking right from their lock screen. Live Activities was originally scheduled to be released in iOS 16, but it was delayed until later this year.
AirPlay to hotel rooms: AirPlay to hotel rooms is a new feature that will allow users to mirror their iPhone or iPad screen to a TV in a hotel room. AirPlay to hotel rooms was originally scheduled to be released in iOS 16, but it was delayed until later this year.
Apple has not yet announced a specific timeframe for when these features will be released. However, the company has said that they are still working on them and that they will be released "later this year."
The YouTube channel Zolotech…
The YouTube channel Zolotech did a brief overview of the new accessibility features in iOS 17. The brief demo of Point and Speak sounded promising, but there were a lot of things I couldn't pick up visually. I left a comment asking a bunch of clarification questions but haven't gotten a reply yet.
It's worth noting that this was done by a sighted person, so they might not put it through some of the things we would. The guy usually does a good job reading text and describing what he's doing, though. I know at least one other forum member here comments on his videos.
works pretty well
I've tried it on a stove with a flat panel of buttons, an air conditioner, a toaster oven, and an Instant Pot. It works pretty well as far as accuracy goes, but the process of using it is pretty cumbersome. You have to keep your phone parallel to the surface you are trying to read while using your other hand to point at the button you want identified. This strikes me as a feature that will work far better on something like the Apple Vision Pro, where the camera is on your head and your hands are free. Tested on an iPhone 14 Pro.