Introducing SunoDayko, a utility for the deaf-blind and the blind

By Adarsh Hasija, 15 September, 2020

Forum: iOS and iPadOS

I would like to share an app that I have created: SunoDayko. You can find it on the Apple App Store by searching SunoDayko. I apologize for not being able to post a link to it. The site is not letting me submit the post with a link included.

It has features that I felt would be useful for 2 groups: the visually impaired and the deaf-blind. Let me explain the 3 features that I have at the moment:
-TIME in morse code. Designed for the deaf-blind. Tap once and swipe up
-DATE in morse code. Designed for the deaf-blind. Tap twice and swipe up. Note that the app will not give you the full date. It will give you the date and the first 2 letters of the day of the week. It does not provide the month or year. I did this because I felt people usually know the month and the year and do not need the assistance of an app for that. I could be wrong on this. Please share your thoughts.
-CAMERA. Designed for both the blind and deaf-blind. This can be used to get text. For example: flat numbers on doors or meeting room names on doors.
To access this, do the following:
-Tap 3 times and swipe up to open the camera
-Point the camera at a door where a flat number or meeting room name is clearly visible
-If the app reads the text successfully, the app will vibrate to indicate success. If you are visually impaired, you can tap the screen to hear the text. If you are deaf-blind, you can start swiping right with 2 fingers to read through the morse code version.
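
For readers who are technically curious, text recognition like this is normally done on-device with Apple's Vision framework. Below is a minimal sketch of that general approach; it is only an illustration under that assumption, not SunoDayko's actual code.

import Foundation
import CoreGraphics
import Vision

// Recognize text in a still image with the Vision framework (iOS 13+).
// General illustration only; not SunoDayko's actual implementation.
func recognizeText(in image: CGImage, completion: @escaping (String?) -> Void) {
    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        // Keep the most confident candidate from each detected line of text.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(lines.isEmpty ? nil : lines.joined(separator: " "))
    }
    request.recognitionLevel = .accurate   // favour accuracy over speed

    DispatchQueue.global(qos: .userInitiated).async {
        let handler = VNImageRequestHandler(cgImage: image, options: [:])
        do { try handler.perform([request]) } catch { completion(nil) }
    }
}

This is also why large, isolated text such as a flat number works so much better than a cluttered sign: the recognizer returns every line it can see, and surrounding clutter dilutes the result.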

I have tried to ensure it is compatible with VoiceOver. Please take a look.

I'd like to finish by admitting that I simply designed this because I felt these features would be useful. I felt that using morse code could help the deaf-blind, at least for simple things like date and time. The morse code functionality can also be an alternative to carrying around a braille keyboard.
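
To make "morse code via vibrations" a little more concrete, here is a minimal Swift sketch of the general technique: map each character to dots and dashes, then play one vibration pulse per symbol. It is only an illustration with an abbreviated symbol table, and it assumes standard UIKit haptics; it is not the app's actual code.

import UIKit

// Abbreviated morse table: digits plus a few letters (a full table would cover A-Z).
let morseTable: [Character: String] = [
    "0": "-----", "1": ".----", "2": "..---", "3": "...--", "4": "....-",
    "5": ".....", "6": "-....", "7": "--...", "8": "---..", "9": "----.",
    "M": "--", "O": "---", "T": "-", "W": ".--"
]

// Convert text such as "14 MO" into dots and dashes, one group per character.
func morse(for text: String) -> String {
    return text.uppercased().compactMap { morseTable[$0] }.joined(separator: " ")
}

// Play a morse string as haptics: a light tap for a dot, a heavy tap for a dash,
// and a short pause for the gap between characters.
func vibrate(morse: String) {
    let dot = UIImpactFeedbackGenerator(style: .light)
    let dash = UIImpactFeedbackGenerator(style: .heavy)
    var delay: TimeInterval = 0
    for symbol in morse {
        if symbol == " " { delay += 0.3; continue }
        let generator = (symbol == ".") ? dot : dash
        DispatchQueue.main.asyncAfter(deadline: .now() + delay) {
            generator.impactOccurred()
        }
        delay += 0.25
    }
}

In the app itself the user steps through the code with 2-finger swipes and feels the vibrations as they go; the sketch above just shows the underlying mapping and vibration idea.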

The camera feature works best when the text you are trying to capture is large and there is no small text around it. The best examples are:
-Flat numbers on a door. I understand flat numbers are not always on the door, but if they are, then this app's camera can read them
-Meeting room names on doors. Another example where the text is large and there is probably nothing written around it
Finally, I will say that after opening the camera, you should wait up to 10 seconds. If the phone does not vibrate to confirm that it has succeeded, it likely did not find anything.

I designed these features based on my feeling that it would be useful. I could be wrong. That is why I am here. A friend pointed me to this site saying that the community will gladly give you feedback. So please do. Suggest any modifications you like. And if you feel that I am completely wrong and these features are not useful at all, please feel free to say that too. Thank you for your time :)


Comments

By Khushi on Thursday, October 8, 2020 - 09:30

Hi,
TalkBack works on Android and it is a screen reader specifically for Android. For Apple products, the screen reader is VoiceOver. I hope it's compatible with iOS devices.
Please provide more details on the app: what is it supposed to do, and is this the only feature supported in this app?
Thank you.

By Adarsh Hasija on Thursday, October 8, 2020 - 09:30

In reply to by Khushi

My mistake, I meant VoiceOver. Thanks for pointing it out. I have edited my post accordingly. Yes, it is available on iOS devices. You can search for SunoDayko on the iOS App Store.

The app actually has 3 features, which I designed because I felt they would be useful for VI and deaf-blind. I am now trying to get feedback on these. The 3 features at the moment are:
-TIME in morse code for deaf-blind (tap once and swipe up)
-DATE in morse code for deaf-blind (tap twice and swipe up)
-CAMERA mode (tap 3 times and swipe up)

Let me explain camera mode a bit more. It can pick up text that is clearly written; examples include a flat number on a door or a meeting room name on a door. I feel this can help the VI navigate offices. Once you open the camera, point it at the door where the text is written and the app will vibrate when it recognizes the text. You can then just tap the screen to hear the text.

These are the only features at the moment. I will build out more over time.

By Siobhan on Thursday, October 8, 2020 - 09:30

Hello. I haven't downloaded this app yet, partly because I haven't looked at the spelling. What concerned me is the use of the words "clearly visible". For instance, if I were to walk back to my apartment, I am not quite sure exactly where the numbers are. So what do I do? Wave the phone around, looking like even more of a less-than-sane person than I may already look to the average Joe walking by? Sarcasm aside, there isn't enough information to really make me want to download this. Perhaps if someone does a podcast. I did look, and this is not available in the U.S. App Store. I am also of the belief that this app may be in a different language rather than English.

By Adarsh Hasija on Thursday, October 8, 2020 - 09:30

In reply to by Siobhan

That's a good question. Let me address this concern regarding "clearly visible". It means that the text you want to capture should be relatively large and there should be no small text around it. That's why I used the examples of flat numbers or meeting room names. These are 2 examples where the text in question is big and it's the only thing written on the door. There is unlikely to be other text around it.

To take your example of walking back to your apartment: you would likely know that the flat number is on the upper half of the door. So once you reach the apartment, you would open the in-app camera and point it up at the door where the flat number is written. The app will pick it up.

I understand that it may be difficult to understand the functionality from a written description, so here is a video of the camera functionality in action. I hope it makes things a little clearer. https://youtu.be/PajVftd_MdI

By Brad on Thursday, October 8, 2020 - 09:30

I've not tried the app, but the OP does mention that you can swipe up and the app will vibrate the date and time. I think that should be explained a little more.

Honestly, I think you're being a little harsh; the OP's trying their best. As for the US store, perhaps the OP isn't from the US and that's why it's not there yet. Maybe where the OP is from, numbers are like that on doors. Please remember, the US isn't the world.

By Brad on Thursday, October 8, 2020 - 09:30

I just tried it and it works OK, but there's something in VoiceOver where you can swipe and tap and VoiceOver will act like the screen reader is off. I have no idea what that's called, but you'll need to add that to your code.

At the moment, it's very hard to double tap and hold, then swipe right with two fingers. When I turned VoiceOver off it worked fine.

Another thing is the vibration for date and time. I don't think morse code is the best option. It's 5:10 here in the UK; what about five quick vibrations and then ten after? I honestly don't know exactly how good that would be.

You could try this subreddit: https://www.reddit.com/r/deafblind/ The thing is, its last post was a couple of days ago, so I don't know how active it is.

Oh and to the person talking about the app not being in the US, are you sure you spelt it correctly? I found it in the UK with no issues.

By SeasonKing on Thursday, October 8, 2020 - 09:30

Hi, I am also from India. Nice to see some homegrown devs becoming aware of us. Congratulations, and thank you so much.

By Adarsh Hasija on Thursday, October 8, 2020 - 09:30

In reply to by Brad

Thank you Brad for trying out the app. Let me address the doubts you mentioned:

You said: OP does mention that you can swipe up and the app will vibrate the date and time; I think that should be explained a little more.
My response: When you tap once and swipe up, the app will give you the TIME in numbers (24-hour format) and morse code. This has been designed for the deaf-blind, as a way to let them read the time without using a braille display. They can swipe right with 2 fingers to read the morse code. The app communicates morse code via vibrations. This feature wasn't really designed for the VI, as I'm aware they can access the time from the lock screen itself. However, if they want to use this feature, they can. After tapping once and swiping up, they just tap the screen again to hear the time.

-Tap twice and swipe up to get DATE. Again, primarily designed for the deaf-blind. This returns the date and the first 2 letters of the day of the week. Two important points to share here:
1. I do not return the month or year, as I assume the user already knows that. On any particular day, they possibly do not recall the date or the day of the week.
2. First 2 letters of the day of the week: I return only this because I realize it may be cumbersome to read the morse code, so I want to provide the minimum number of characters needed to convey the message. So I give the first 2 characters. (The exact time and date formats are illustrated in the short sketch after this list.)

-Tap 3 times and swipe up to open CAMERA. The camera can detect text. It works best when pointed at large text with no other text around it. Examples include flat numbers on doors and meeting room names on doors. One should wait up to 10 seconds. If the device vibrates once, it means the camera was successful. It will return the text it detected and the morse code version. A VI person can tap the screen to hear the text. A deaf-blind person can start swiping right with 2 fingers. The app will communicate the morse code via vibrations.
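
To make the time and date formats described above concrete, here is a small Swift illustration (a sketch of the described output only, not the app's actual code):

import Foundation

// TIME: 24-hour "HH:mm", e.g. "17:10".
func timeString(for date: Date = Date()) -> String {
    let formatter = DateFormatter()
    formatter.dateFormat = "HH:mm"
    return formatter.string(from: date)
}

// DATE: day of the month plus the first 2 letters of the weekday,
// e.g. "8 TH" for Thursday the 8th. Month and year are deliberately omitted.
func shortDateString(for date: Date = Date()) -> String {
    let day = Calendar.current.component(.day, from: date)
    let weekdayFormatter = DateFormatter()
    weekdayFormatter.dateFormat = "EEEE"   // full weekday name, e.g. "Thursday"
    let weekday = weekdayFormatter.string(from: date)
    return "\(day) \(weekday.prefix(2).uppercased())"
}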

I hope this offers better clarity. Thanks for your question :)

Again, thank you Brad for your observations. Let me address these:

You said: At the moment, it's very hard to double tap and hold, then swipe right with two fingers. When I turned VoiceOver off it worked fine.
My response: Just to clarify, you're saying a 2-finger swipe is difficult with VoiceOver ON? If so, that's a good point. I'll try to think of a different user interaction for the morse code.
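
For what it's worth, the VoiceOver behaviour you described, where touches pass straight through to the app as if the screen reader were off, is normally achieved with the "direct touch" accessibility trait. A minimal sketch, where gestureArea is just a placeholder name for whatever view receives the morse gestures:

import UIKit

// Marking a view with the direct-touch trait lets taps and swipes inside it reach the
// app directly while VoiceOver is running. "gestureArea" is a placeholder name for
// whatever view receives the morse gestures.
let gestureArea = UIView()
gestureArea.isAccessibilityElement = true
gestureArea.accessibilityLabel = "Morse code area"
gestureArea.accessibilityTraits.insert(.allowsDirectInteraction)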

You said: Another thing is the vibration for date and time. I don't think morse code is the best option. It's 5:10 here in the UK; what about five quick vibrations and then ten after? I honestly don't know exactly how good that would be.
My response: OK, so you're saying this method is better than the morse code option? That's valuable info. I'm just gathering feedback on these features.

https://www.reddit.com/r/deafblind/
Thank you for sharing this link!

By Siobhan on Thursday, October 8, 2020 - 09:30

In reply to by Brad

OK, after having a tough time, I did download it, and to be honest, no, we are not being harsh. I am frustrated after playing with the app. Here is what I found: the Actions button is clearly labeled, and the same goes for everything else, but there is no easy way to do anything. It says tap the screen for a dot. I do, hear the VoiceOver sound, and nothing else. I turned it off, swiped up, and made no headway. I'm not discounting this app; I just think it is poorly designed, and if we were taught how to use it, that would help.

By Khushi on Thursday, October 8, 2020 - 09:30

Hi,
I haven't downloaded the app, but the app has potential.
I think you need to work on it. Possibly you could gather beta testers for this app who can suggest features and work with you on it, maybe? I'm not sure really, but I feel you can gather feedback in a more specific way by asking people to beta test the app, submit bug reports, or by having a bug-report team.
To be honest, I don't know how well it will work.
And I think we should not discourage the developer like that :)
All the best :)

By Brad on Thursday, October 8, 2020 - 09:30

It's OK, I came across as harsh there too.

Make sure you're spelling the name correctly when searching for it on the app store.

I've deleted the app as I have no use for it, but I see where you're coming from.

You're meant to tap and hold then swipe up for the extra bit from what I remember.

You're better off getting answers from deaf-blind people. Keep in mind, they may not even want this as a feature.

I think hardly anyone knows morse code; I don't, and I know for a fact the people around me don't either.

By Adarsh Hasija on Thursday, October 8, 2020 - 09:30

In reply to by Siobhan

Thank you for trying it out and providing feedback. I am sorry that the experience has been frustrating for you. Let me explain how the app is meant to work with VoiceOver enabled:
As you have rightly observed, when you start the app, the screen says "Tap the screen for a dot".
After tapping the screen, VoiceOver should give you the next instruction, which is "Swipe up to get TIME. Add a dot to get DATE". This means that if you swipe up, you get the TIME. I believe swipe up is a 3-finger swipe up with VoiceOver enabled.
If you add another dot (by tapping again), VoiceOver should tell you to swipe up to get DATE.
If you add another dot by tapping again, VoiceOver should tell you to swipe up to get CAMERA.
To summarize the interactions:
Tap once and swipe up = TIME
Tap twice and swipe up = DATE
Tap 3 times and swipe up = CAMERA

Note that the action is completed only after swiping up. For example, tap once, then swipe up, and the app will vibrate to confirm that it has understood the request for the TIME and it will give you the time (in 24-hour format and morse code). Then a VI person can tap the screen to hear the time, whereas a deaf-blind person can swipe right with 2 fingers to read the morse code.
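
In code terms, the whole interaction model boils down to something like this (the names are illustrative, not the app's actual code):

// Each tap adds a "dot"; swiping up commits the action for the current dot count.
enum SunoAction {
    case time, date, camera
}

func action(forDotCount dots: Int) -> SunoAction? {
    switch dots {
    case 1: return .time     // tap once, then swipe up
    case 2: return .date     // tap twice, then swipe up
    case 3: return .camera   // tap 3 times, then swipe up
    default: return nil      // any other count is ignored
    }
}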

If you feel these features are of use to you and want to give the app another try, I would appreciate that. If not, it's OK, I understand. I thank you for your feedback. It has made me realize that the design is probably not intuitive enough and I need to address that.

By Adarsh Hasija on Thursday, October 8, 2020 - 09:30

Thank you all for responding to my post. I apologize that my post was not clear enough about the features of the app and how it was meant to benefit the target users. I am also sorry for the frustrating experience that many of you appear to have had with the app. Clearly I need to work on the design. I will work on this going forward.

Please know that I designed this with the best of intentions. I thought I could use morse code as a way to communicate information to deaf-blind people without them needing to plug in a braille display. For the VI, I thought I could help them navigate to a location, like a meeting room, by using the camera to identify the meeting room name.
I felt these features would be useful, but I had nobody to validate them with. Someone recommended that I try out this site for feedback, so I posted it here.

I would like to ask if anyone is willing to be a beta tester for this. Simply volunteer a few minutes as and when design changes are made and features are released. If nobody is willing to, that's alright. I just thought I would ask once.