Accessibility


Make your apps function for a broad range of users using Accessibility APIs across all Apple platforms.

Accessibility Documentation

Posts under Accessibility tag

133 Posts
Post not yet marked as solved
1 Reply
263 Views
My app language is set to French. If my device language is also set to French, I do not see this issue. If my device language is set to English, VoiceOver reads all UIKit views in the app with a female voice (French accent) and all SwiftUI views with a male voice (French accent). Is this a bug in SwiftUI or UIKit? I would expect the voice gender to stay the same regardless of whether a view is UIKit or SwiftUI. I have tried many approaches, but none fixed the issue, and I am not sure what is causing it or how to fix it. Our app is mostly built in UIKit and we have started using SwiftUI for small views; if these accessibility issues continue, we would like to stop using SwiftUI. One of the things I tried was setting the locale of the SwiftUI view with .environment(\.locale, "fr-CA"). I could not find anything in the documentation that addresses this.
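A side note on that last attempt (a minimal sketch that corrects the snippet, not a confirmed fix for the voice-gender mismatch; the view and text are placeholders): the locale environment value takes a Locale, not a string, so the call above would not compile as written.

import SwiftUI

struct FrenchContentView: View {
    var body: some View {
        // Placeholder content; the real SwiftUI subview goes here.
        Text("Bonjour")
            // .environment(\.locale, ...) expects a Locale value, not a raw string.
            .environment(\.locale, Locale(identifier: "fr-CA"))
    }
}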
Posted by Rupika. Last updated.
Post not yet marked as solved
0 Replies
250 Views
I've been working on migrating some graphics to Swift Charts for an app I work on. However, I've been noticing strange behavior when it comes to VoiceOver. If I create a bar chart and use:

BarMark(x: .value("Month", x, unit: .month), y: ...)

the chart looks fine, but the VoiceOver values seem to follow arbitrary values set for the bin. From what I can tell, they follow the underlying bin values that Swift Charts uses to provide spacing between bars. For instance, this simple example:

let monthlyRevenueData = [
    (x: try! Date("2024-01-01T00:00:00Z", strategy: .iso8601), y: (income: 55000, revenue: 124000), id: UUID()),
    (x: try! Date("2024-02-01T00:00:00Z", strategy: .iso8601), y: (income: 58000, revenue: 130000), id: UUID()),
    (x: try! Date("2024-03-01T00:00:00Z", strategy: .iso8601), y: (income: 59000, revenue: 120000), id: UUID()),
]

struct ContentView: View {
    var body: some View {
        Chart(monthlyRevenueData, id: \.id) { (x, y, _) in
            BarMark(x: .value("Month", x, unit: .month), y: .value("Income", y.income))
                .foregroundStyle(.green)
            BarMark(x: .value("Month", x, unit: .month), y: .value("Revenue", y.revenue))
        }
    }
}

#Preview {
    ContentView()
}

results in VoiceOver reading "January 14th 2024 at 12 AM to January 28th 2024 at 12 AM ..." despite the fact that the data should cover the entire month of January. Is there any way to get VoiceOver to read the input data rather than relying on how the chart is formatted? Preferably without removing all visual spacing between the bars. Video link: https://drive.google.com/file/d/11mxCl3wR2HzoOaihOvci-vZk4zgG1d39/view?usp=drive_link
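A possible workaround (a sketch reusing the tuple data above; the formatter and label strings are my own choices, not a confirmed Swift Charts recipe): each mark can carry an explicit accessibility label and value, which should override the bin-derived date range VoiceOver reads by default.

let monthFormat = Date.FormatStyle().month(.wide).year()

Chart(monthlyRevenueData, id: \.id) { item in
    BarMark(x: .value("Month", item.x, unit: .month), y: .value("Income", item.y.income))
        .foregroundStyle(.green)
        // Speak the original month and figure instead of the bin boundaries.
        .accessibilityLabel("\(item.x.formatted(monthFormat)), income")
        .accessibilityValue("\(item.y.income)")
    BarMark(x: .value("Month", item.x, unit: .month), y: .value("Revenue", item.y.revenue))
        .accessibilityLabel("\(item.x.formatted(monthFormat)), revenue")
        .accessibilityValue("\(item.y.revenue)")
}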
Posted. Last updated.
Post not yet marked as solved
2 Replies
374 Views
When Full Keyboard access is enabled, the currently focused element is indicated by a thick border (first screenshot below). If the focused element is inside a focus group, e.g. a UIScrollView, then the thick border encloses the entire focus group, and the focused element is indicated by a change in background color instead (second screenshot below). These two types of focus state seem to use the tintColor of the element. We were advised that the change in background color does not meet WCAG standards since the contrast ratio between the non-focused state and the light blue focused state is not high enough. Apart from changing the tintColor, is there any other way to customize the focused appearance of an element? It would be ideal if we could apply a border to the focused element even when it's contained in a focus group, rather than just changing the background color.
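One avenue to explore (a sketch; it adds an extra indicator rather than changing the built-in Full Keyboard Access highlight, and the border width/color are placeholders): the focus engine calls didUpdateFocus(in:with:) whenever focus moves, so a cell can draw its own high-contrast border even when it sits inside a focus group.

import UIKit

final class FocusableCell: UICollectionViewCell {
    override var canBecomeFocused: Bool { true }

    override func didUpdateFocus(in context: UIFocusUpdateContext,
                                 with coordinator: UIFocusAnimationCoordinator) {
        super.didUpdateFocus(in: context, with: coordinator)
        coordinator.addCoordinatedAnimations({
            if self.isFocused {
                // High-contrast border drawn in addition to the system highlight.
                self.contentView.layer.borderWidth = 3
                self.contentView.layer.borderColor = UIColor.label.cgColor
            } else {
                self.contentView.layer.borderWidth = 0
            }
        }, completion: nil)
    }
}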
Posted by glow. Last updated.
Post not yet marked as solved
0 Replies
230 Views
I want to trigger a "pinch" so that I can select whatever I'm looking at by pressing the spacebar on a Bluetooth keyboard paired with the Apple Vision Pro. Is this feasible in Xcode or Unity (visionOS development), where I not only have access to native keyboard presses but can also see the gaze position (or whatever is currently highlighted/focused) without going through an interaction event? In other words: on a key press, select the item being gazed at. Pinch won't be available to people with hand/arm impairments as a way to trigger an interaction event, and there is an array of accessibility devices that send keyboard commands which could help here. Apple should have this ability in its native settings, specifically via keyboard commands. This app would demonstrate that ability before it exists natively, and any help would be greatly appreciated :)
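For context, a sketch under stated assumptions: visionOS does not expose raw gaze data to apps for privacy reasons, so the closest approximation I know of is to make views focusable and act on hardware-keyboard focus rather than on gaze itself. The view and callback names are placeholders, and .onKeyPress requires visionOS 1.0 / iOS 17 or later.

import SwiftUI

struct SelectableTile: View {
    let title: String
    let select: () -> Void

    var body: some View {
        Text(title)
            .padding()
            .focusable()            // lets hardware-keyboard focus land on this view
            .onKeyPress(.space) {   // spacebar acts like a pinch on the focused view
                select()
                return .handled
            }
    }
}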
Posted. Last updated.
Post not yet marked as solved
1 Reply
522 Views
I'm testing Full Keyboard Access in my app and in the built-in iPhone apps on my iPhone 12 mini with iOS 17. My work will directly impact how much accessibility review is done on our iOS app, which has millions of unique views a month.

In several Apple apps I cannot scroll down through the screen when the main view has focus. For example, the Home app does not scroll with the arrow keys or Ctrl+Tab through any of the six main content groups on the Discover screen; it almost appears to be a single static image, and the "Getting Started" button cannot be activated. I can activate sections further down when I enable gestures, but I cannot pinpoint a specific location. The Stocks app includes Top Stories from the Apple News app; in either app I can select a story, which brings up the article full screen, but then I cannot use the arrow keys or Ctrl+Tab to read the article or interact with inline links. Ctrl+Tab selects button features such as watching an embedded video or live coverage, then jumps down to the end of the article to focus on Related Stories, ignoring all the links in between. I am able to move somewhat through the article text with keyboard gestures, but many of these articles have embedded links or content after the article (before "Related Stories").

I work in digital accessibility and need to be able to tell my teams what the expected behavior is and where to see examples of it. If Apple can't demonstrate Full Keyboard Access in its own apps, this is a problem. Our own app has some of these issues, but I am unsure how to recommend a solution when scroll views seem not to work in Apple's own native iOS apps.
Posted by jsponger. Last updated.
Post not yet marked as solved
1 Reply
307 Views
The structure of the UI is a bit complicated, so I'll do my best to explain. There is a UITableView that has a sibling in its view hierarchy: footer buttons that sit on top of the UITableView at the bottom of the screen. I am using a one-finger swipe gesture to iterate over the different elements on the page and in the table view. Each cell in the UITableView hosts a UIViewController which has a UICollectionView. This UICollectionView has cells with multiple views nested inside, most of them dummy views. The cells have different structures, and VoiceOver works well across them without any customisation, identifying all the right accessibility elements.

The problem is in the last cell on the page. Imagine it has two UILabels and two UIButtons. When navigating with plain VoiceOver and not defining any accessibilityElements, the order is wrong, so I added:

override var accessibilityElements: [Any]? {
    get { [label1, button1, label2, button2] }
    set { }
}

When navigating to this cell, everything works fine, but once an element inside this particular last cell is highlighted it gets messed up. The order is fine, but VoiceOver ends up looping inside the cell: it doesn't go back to the other cells or navigate to the footer of the page. If I remove the accessibilityElements array then everything works, just not in the correct order. Does anybody know why that might be and how to break the loop? It would also be helpful to know how VoiceOver decides which view to navigate to next.
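For what it's worth, a sketch of one thing to try rather than a confirmed fix: assign accessibilityElements once after the subviews exist instead of rebuilding the array in the getter, and make sure the cell itself is not an accessibility element, since a container that is both an element and a provider of elements can confuse VoiceOver navigation. The property names match the post.

override func layoutSubviews() {
    super.layoutSubviews()
    // The cell acts as a container, not as an element of its own.
    isAccessibilityElement = false
    accessibilityElements = [label1, button1, label2, button2]
}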
Posted by mrikh. Last updated.
Post not yet marked as solved
1 Reply
330 Views
Is there any way for a 3rd party macOS app to receive some sort of notification for a change to the Text Size accessibility setting in the Settings app? I have not been able to find any API for this. Several Apple apps (Mail, Notes, and others) update text size based on the setting. I'd like to do the same in my own macOS app.
Posted by RickMaddy. Last updated.
Post not yet marked as solved
2 Replies
285 Views
We are using UIDocumentInteractionController to preview a PDF. When you navigate on the phone with a hardware keyboard and focus lands on the PDF preview, there is no way to reach the toolbar buttons anymore, such as the Share and Done buttons. Is this a bug, or is there a way to get to the navigation/toolbars? iOS 17.0, iPhone 14 simulator and real device.
Posted by Jeroen74. Last updated.
Post not yet marked as solved
1 Reply
221 Views
I'm looking for an accessibility modifier (or some other method) in SwiftUI that does the same job as UIKit's accessibilityLanguage property: https://developer.apple.com/documentation/objectivec/nsobject/1615192-accessibilitylanguage

We've got a few screens in our app for which the display language is server-dictated instead of device-dictated, and without this property VoiceOver is reading Spanish with the English parser and accent when the device is set to English. Thanks for any information.
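One approach that may cover this (a sketch, assuming iOS 15 or later; the Spanish string and language code are placeholders): Foundation's AttributedString has an accessibilitySpeechLanguage attribute, and SwiftUI's accessibilityLabel accepts a Text built from an attributed string, so the spoken language can be set per element.

import SwiftUI

struct ServerLocalizedLabel: View {
    let serverText = "Hola, bienvenido"   // hypothetical server-provided Spanish copy

    var body: some View {
        Text(serverText)
            .accessibilityLabel(Text(spokenLabel))
    }

    private var spokenLabel: AttributedString {
        var label = AttributedString(serverText)
        // Ask VoiceOver to use the Spanish synthesizer for this element.
        label.accessibilitySpeechLanguage = "es-ES"
        return label
    }
}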
Posted by Purseus. Last updated.
Post not yet marked as solved
2 Replies
307 Views
I'm experiencing an issue with the Accessibility Inspector. It worked fine before, but since the update I'm unable to perform an audit while in the Simulator. I get the following message: "Select a target app to view Accessibility warnings and audit information." However, the VoiceOver and Dynamic Type functionalities are working fine. The audit only works when I test on my iPhone. How can I connect the target for the audit? Thank you. Accessibility Inspector 5, Simulator 15.2.
Posted by jcaero. Last updated.
Post not yet marked as solved
0 Replies
286 Views
Hello everyone, I'm currently working on a project where I need to simulate mouse drag-and-drop events in macOS using Swift. I have written the following code, which works well in most applications, but I'm encountering issues in some applications like Finder, where the leftMouseDragged event doesn't seem to work correctly. Click events work fine everywhere. Here's the code I'm using for the drag event:

let location = CGEventTapLocation.cghidEventTap
let source = CGEventSource(stateID: .combinedSessionState)
let clickAtStart = CGEvent(mouseEventSource: source, mouseType: .leftMouseDown, mouseCursorPosition: locationPoint, mouseButton: .left)
let dragAndDrop = CGEvent(mouseEventSource: source, mouseType: .leftMouseDragged, mouseCursorPosition: locationPointDrag, mouseButton: .left)
clickAtStart?.post(tap: location)
usleep(200000)
dragAndDrop?.post(tap: location)

And the analogous code for the drop event:

let location = CGEventTapLocation.cghidEventTap
let source = CGEventSource(stateID: .hidSystemState)
let dragAndDropRelease = CGEvent(mouseEventSource: source, mouseType: .leftMouseUp, mouseCursorPosition: locationPoint, mouseButton: .left)
dragAndDropRelease?.post(tap: location)

I've tried changing the order of events, using different event types, and checking the permissions of my app, but none of these solutions have worked. I'm wondering if anyone has encountered a similar issue or has any suggestions on how to resolve this. Any help would be greatly appreciated!
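One workaround worth trying (a sketch, on the assumption that Finder only recognizes a drag that moves through intermediate positions rather than a single jump; the step count and delays are arbitrary): interpolate several leftMouseDragged events between the start and end points, and tag the down/drag/up events with a click state.

import Foundation
import CoreGraphics

func simulateDrag(from start: CGPoint, to end: CGPoint, steps: Int = 20) {
    let tap = CGEventTapLocation.cghidEventTap
    let source = CGEventSource(stateID: .combinedSessionState)

    let down = CGEvent(mouseEventSource: source, mouseType: .leftMouseDown,
                       mouseCursorPosition: start, mouseButton: .left)
    down?.setIntegerValueField(.mouseEventClickState, value: 1)
    down?.post(tap: tap)
    usleep(100_000)

    // Move in small increments so the target app sees a continuous drag.
    for i in 1...steps {
        let t = CGFloat(i) / CGFloat(steps)
        let point = CGPoint(x: start.x + (end.x - start.x) * t,
                            y: start.y + (end.y - start.y) * t)
        let drag = CGEvent(mouseEventSource: source, mouseType: .leftMouseDragged,
                           mouseCursorPosition: point, mouseButton: .left)
        drag?.setIntegerValueField(.mouseEventClickState, value: 1)
        drag?.post(tap: tap)
        usleep(10_000)
    }

    let up = CGEvent(mouseEventSource: source, mouseType: .leftMouseUp,
                     mouseCursorPosition: end, mouseButton: .left)
    up?.setIntegerValueField(.mouseEventClickState, value: 1)
    up?.post(tap: tap)
}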
Posted by XVC_. Last updated.
Post not yet marked as solved
0 Replies
311 Views
I am using the default AVPlayerViewController with the default player controls and skip buttons for video streaming in a tvOS app. Custom controls/buttons are not being used. Can we override the VoiceOver accessibility text/behaviour of AVPlayerViewController's default player controls? I am unable to find any Apple documentation on this and am not sure whether it is even possible.
Posted. Last updated.
Post marked as solved
2 Replies
312 Views
Background: I have a UICollectionViewCell where I override the accessibilityElementDidBecomeFocused method. I also override canBecomeFocused to always return true. The app supports Accessibility > Full Keyboard Access (allowing the user to use the app with just a keyboard). isAccessibilityElement is also set to true for the collection view cell.

Problem: I have two scenarios.
1. When both Full Keyboard Access and VoiceOver are enabled, accessibilityElementDidBecomeFocused gets called as expected.
2. When Full Keyboard Access is enabled but VoiceOver is disabled, accessibilityElementDidBecomeFocused isn't called.

Is this the expected behaviour? If yes, is there a callback available when a view element gets focused via the keyboard with VoiceOver disabled?
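For what it's worth, a sketch rather than a confirmed answer: accessibilityElementDidBecomeFocused is an assistive-technology callback, while Full Keyboard Access without VoiceOver drives the UIKit focus system, so the corresponding callback there is didUpdateFocus(in:with:) on the cell (observing UIFocusSystem.didUpdateNotification is another option).

import UIKit

final class KeyboardFocusAwareCell: UICollectionViewCell {
    override var canBecomeFocused: Bool { true }

    override func didUpdateFocus(in context: UIFocusUpdateContext,
                                 with coordinator: UIFocusAnimationCoordinator) {
        super.didUpdateFocus(in: context, with: coordinator)
        if context.nextFocusedItem === self {
            // Full Keyboard Access focus landed on this cell (works with VoiceOver off).
        }
    }
}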
Posted by mdar18. Last updated.
Post not yet marked as solved
0 Replies
301 Views
My sandboxed macOS app requires the user to grant permission under Privacy & Security > Accessibility in order to support extra functionality. If no permission is granted, the app can still be used, albeit with very basic functionality. To allow the user NOT to have to decide immediately whether to grant this permission when first launching the app, a dialog allows them to say "I'll do it later". As such, the app uses a timer with a one-second interval to ask the system whether permission has been granted and, if so, enables the extra functionality. By the way, I would rather have used a notification instead of a timer, but there does not seem to be one.

// Schedule a timer to periodically check accessibility status
accessibilityTimer = Timer.scheduledTimer(timeInterval: 1.0, target: self, selector: #selector(checkAccessibilityStatus), userInfo: nil, repeats: true)

func isAccessibilityEnabled() -> Bool {
    let accessibilityEnabled = AXIsProcessTrusted()
    return accessibilityEnabled
}

@objc func checkAccessibilityStatus() {
    if isAccessibilityEnabled() {
        print("Accessibility is enabled.")
        accessibilityTimer?.invalidate()
        if gEventTap == nil {
            tapper() //as003
            gTypeIt4MeMenu?.item(at: kPauseResumeItem)?.title = "Pause"
            gStatusItem?.button!.image = NSImage(named: "menubar_icon_16x16")
            NotificationCenter.default.post(name: NSNotification.Name(rawValue: "showGreenTick"), object: nil)
        }
    } else {
        print("Accessibility is disabled.")
    }
}

My problem is that when I build the app with my development certificate, it runs as expected. However, when I upload it to TestFlight and download it from there, it no longer "notices" when I grant it permission.
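On the "I would rather have used a notification" point, a sketch with a loud caveat: the notification name below is undocumented and observed only by convention, so it could change in any release. Some apps watch the distributed notification "com.apple.accessibility.api", which is posted when the Accessibility permission list changes, and then re-check AXIsProcessTrusted(). It does not explain the TestFlight difference, but it avoids the polling timer.

import ApplicationServices
import Foundation

// Undocumented notification name; treat as an assumption, not API.
let axChangedNote = Notification.Name("com.apple.accessibility.api")

let observer = DistributedNotificationCenter.default().addObserver(
    forName: axChangedNote,
    object: nil,
    queue: .main
) { _ in
    // The permission list changed; re-check whether this process is now trusted.
    // A short delay helps because the change may not be visible immediately.
    DispatchQueue.main.asyncAfter(deadline: .now() + 0.5) {
        if AXIsProcessTrusted() {
            print("Accessibility permission granted.")
        }
    }
}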
Posted by rettore. Last updated.
Post not yet marked as solved
2 Replies
277 Views
I have a collection view with a hierarchical data source. Because of that, I create some cells with a UICellAccessoryOutlineDisclosure accessory using the style UICellAccessoryOutlineDisclosureStyleCell, so one can either tap on the cell to open a detail view or tap on the outline disclosure accessory to reveal hidden child data. Question: how should I configure the outline disclosure accessory to work with VoiceOver on? It works fine without VoiceOver, but with VoiceOver it seems that any gesture always leads to opening the detail view.
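One pattern that may help (a sketch, assuming a diffable data source with section snapshots; the generic names and action title are placeholders): expose expand/collapse as a custom VoiceOver action on the cell, so the default double-tap can keep opening the detail view while the actions rotor toggles the children.

import UIKit

// Builds a VoiceOver custom action that toggles the children of a hierarchical item.
func makeToggleChildrenAction<Section: Hashable & Sendable, Item: Hashable & Sendable>(
    for item: Item,
    in section: Section,
    dataSource: UICollectionViewDiffableDataSource<Section, Item>
) -> UIAccessibilityCustomAction {
    UIAccessibilityCustomAction(name: "Toggle children") { _ in
        var snapshot = dataSource.snapshot(for: section)
        if snapshot.isExpanded(item) {
            snapshot.collapse([item])
        } else {
            snapshot.expand([item])
        }
        dataSource.apply(snapshot, to: section, animatingDifferences: true)
        return true
    }
}

The returned action would be assigned to cell.accessibilityCustomActions during cell configuration, leaving the default VoiceOver activation free to open the detail view.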
Posted by kpuchar. Last updated.
Post not yet marked as solved
0 Replies
313 Views
The documentation says "Host your domain-verification file at the following path for each domain you're registering: https://[DOMAIN_NAME]/.well-known/apple-developer-merchantid-domain-association", and that file contains "pspId" and "signature" fields. I just want to know whether these values are sensitive if they get into the hands of an attacker.
Posted by Martizz. Last updated.
Post not yet marked as solved
0 Replies
212 Views
Hi, I'm currently registering notifications on numerous AXUIElementRefs. I would like to find a timestamp of when each event occurs; however, I cannot find a reliable way to do so. Getting a timestamp when the callback is called isn't reliable because the order of callback execution is arbitrary. I know the run loop API is mostly open source, and this is a bit of a reach, but is it possible to hook into the CFRunLoopSourceSignal call from the AXObserverRef? Somewhere in the Apple API stack these notifications are being triggered. My question is: do they record the timestamp, and are there any public or private APIs to gather this information? My goal is to reliably determine in what order certain events happen (e.g. window move, focus, etc.).
Posted by nicksal. Last updated.
Post not yet marked as solved
0 Replies
330 Views
I'm working on converting an app to SwiftUI, and I have a menu that used to be several table cells in a storyboard, but I moved it to an embedded SwiftUI view instead. Here's the old way (from override func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell):

cellReuseID = "BillingToolsCell"
let cell = tableView.dequeueReusableCell(withIdentifier: cellReuseID, for: indexPath)
if let billingToolsCell = cell as? BillingToolsCell {
    billingToolsCell.billingToolsOptions.text = billingTools[indexPath.row].title

    // Accessibility
    billingToolsCell.isAccessibilityElement = true
    billingToolsCell.accessibilityIdentifier = "Billing_\(billingTools[indexPath.row].title.replacingOccurrences(of: " ", with: ""))"
}
return cell

And here's the new way I'm creating the cell:

cellReuseID = "BillingToolsSwiftUI"
if let cell = tableView.dequeueReusableCell(withIdentifier: cellReuseID, for: indexPath) as? SwiftUIHostTableViewCell<BillingToolsView> {
    let view = BillingToolsView(billingToolVM: BillingToolViewModel()) { segueID in
        self.performSegue(segueID: segueID)
    }
    cell.host(view, parent: self)
    return cell
}

Here's the SwiftUI view:

struct BillingToolsView: View {
    @StateObject var billingToolVM: BillingToolViewModel
    var navigationCallback: (String) -> Void

    var body: some View {
        VStack {
            VStack {
                ForEach(self.billingToolVM.billingToolList, id: \.self) { tool in
                    Button {
                        navigationCallback(tool.segueID)
                    } label: {
                        BillingToolsRowView(toolName: tool.title)
                        Divider().foregroundColor(AFINeutral800_SwiftUI)
                    }
                    .accessibilityIdentifier("Billing_\(tool.title.replacingOccurrences(of: " ", with: ""))")
                }
            }
            .padding(.vertical)
            .padding(.leading)
            .background(AFINeutral0_SwiftUI)
        }
    }
}

If I check the Accessibility Inspector, I can see the identifier; here it is showing Billing_PaymentHistory. But when the testers try to run their tests in Appium, they don't see any identifier at all. Did I mess up setting up the accessibility identifier somehow? Or do the testers need to update their script?
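One thing that might be worth checking (a sketch, not a confirmed fix): the old UIKit cell exposed the identifier on the cell itself (isAccessibilityElement = true on the cell), while the new version exposes it only on the SwiftUI Button inside the hosting cell, so a test script that queries the cell element will find nothing there. Combining the button's children into a single element keeps one queryable element per row; the names match the post's code.

Button {
    navigationCallback(tool.segueID)
} label: {
    BillingToolsRowView(toolName: tool.title)
    Divider().foregroundColor(AFINeutral800_SwiftUI)
}
.accessibilityElement(children: .combine)
.accessibilityIdentifier("Billing_\(tool.title.replacingOccurrences(of: " ", with: ""))")

Alternatively, the identifier could also be mirrored onto the hosting table view cell in cellForRowAt, as the previous UIKit version did.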
Posted by KBartlett. Last updated.