How to use GitHub
- Please use the 👍 reaction to show that you are affected by the same issue.
- Please don't comment if you have no relevant information to add. It's just extra noise for everyone subscribed to this issue.
- Subscribe to receive notifications on status change and new comments.
Steps to reproduce
- Share any file from another application, by using the native share dialog and choosing "Nextcloud".
- Swipe right/left with one finger while the VoiceOver screen reader is on.
- (Optional) For comparison with the expected behaviour, cancel the process and navigate to the "Files" tab of the app itself.
Expected behaviour
VoiceOver should read the same information present on screen, as it does in step 3 (the regular "Files" tab).
Actual behaviour
In step 2 (the sharing screen after choosing "Nextcloud"), VoiceOver does not read the name or details of any folder, although they are visible on screen and can be selected, both with and without VoiceOver. The practical consequence is that a VoiceOver user cannot tell which folder is focused with each swipe.
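For reference, a hypothetical Swift sketch of the usual UIKit fix for this class of bug (the class and property names below are illustrative, not taken from the actual Nextcloud source): if the folder cells in the share-destination picker expose no accessibility attributes, VoiceOver has nothing to announce when focus lands on them.

```swift
import UIKit

// Hypothetical cell, assuming the picker lists folders in a collection view.
final class FolderCell: UICollectionViewCell {
    let nameLabel = UILabel()

    func configure(folderName: String, itemCount: Int) {
        nameLabel.text = folderName
        // Make the cell a single accessibility element so one VoiceOver
        // swipe focuses the whole row and reads a meaningful description.
        isAccessibilityElement = true
        accessibilityLabel = folderName
        accessibilityValue = "\(itemCount) items"
        accessibilityTraits = .button
    }
}
```

With labels like these in place, a swipe in step 2 would announce e.g. "Documents, 12 items, button" instead of silence.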
Logs
Reasoning or why should it be changed/implemented?
Due to this critical accessibility issue, users who rely on VoiceOver, and potentially on other assistive technologies that convey programmatically available data to their users, cannot upload files autonomously by sharing them to the app. They need help either from another person or from the screen reader's screen-recognition features, which are often sluggish and not especially reliable.
This issue is related in nature to #3069, which also blocks users of assistive technologies, although on a different screen and not in exactly the same way.
Environment data
iOS version: iOS 26.3.1
Nextcloud iOS app version: 33.0.7.5
Server operating system:
Web server:
Database:
PHP version:
Nextcloud version: