Discuss the latest Apple technologies announced at WWDC22.

Posts under WWDC22 tag

28 Posts
Post marked as solved
2 Replies
137 Views
When using the sample code included with "Running macOS in a virtual machine on Apple silicon", I am making the following changes to the Swift files.

Added to the 'MacOSVirtualMachineConfigurationHelper' file:

    static func createAutomountSingleDirectoryShareDeviceConfiguration() -> VZVirtioFileSystemDeviceConfiguration {
        let sharedDirectory = VZSharedDirectory(url: directoryURL, readOnly: false)
        let singleDirectoryShare = VZSingleDirectoryShare(directory: sharedDirectory)

        // Assign the automount tag to this share. macOS shares automounted directories automatically under /Volumes in the guest.
        let sharingConfiguration = VZVirtioFileSystemDeviceConfiguration(tag: VZVirtioFileSystemDeviceConfiguration.macOSGuestAutomountTag)
        sharingConfiguration.share = singleDirectoryShare

        return sharingConfiguration
    }

Added to the 'path' file:

    let directoryURL = URL(fileURLWithPath: NSHomeDirectory() + "Projects")

Added to the 'AppDelegate' file:

    virtualMachineConfiguration.directorySharingDevices = [MacOSVirtualMachineConfigurationHelper.createAutomountSingleDirectoryShareDeviceConfiguration()]

When these changes are in place and the sample app is run, the following error is shown:

    macOSVirtualMachineSampleApp/AppDelegate.swift:95: Fatal error: Virtual machine failed to start with Error Domain=VZErrorDomain Code=2 "A directory sharing device configuration is invalid." UserInfo={NSLocalizedFailure=Invalid virtual machine configuration., NSLocalizedFailureReason=A directory sharing device configuration is invalid., NSUnderlyingError=0x600000c343c0 {Error Domain=NSPOSIXErrorDomain Code=2 "No such file or directory"}}

On the host, the directory being shared is ~/Projects and it does exist. What do I need to change to create the shared directory and have it work? Is there a sample code project for the configuration that was shown in the demo?
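One thing worth double-checking (an assumption on my part, not confirmed against the sample project): NSHomeDirectory() does not end with a path separator, so NSHomeDirectory() + "Projects" produces a path like /Users/nameProjects rather than ~/Projects, which would line up with the underlying POSIX "No such file or directory" error. A minimal sketch that appends the component explicitly:

    import Foundation

    // Sketch: build the shared-directory URL with an explicit path component,
    // assuming the goal is to share ~/Projects on the host.
    let directoryURL = URL(fileURLWithPath: NSHomeDirectory())
        .appendingPathComponent("Projects", isDirectory: true)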
Posted
by w8ster.
Last updated
.
Post not yet marked as solved
1 Reply
388 Views
Hello, I am reading the documentation and it seems to have some discrepancies, but I wanted to double-check. The Overview for PTChannelManager states: "Multiple calls to channelManager(delegate:restorationDelegate:completionHandler:) result in the system returning the same shared instance, so store the channel manager in an instance variable." https://developer.apple.com/documentation/pushtotalk/ptchannelmanager However, the documentation for creating a channel manager states that the completionHandler returns "A new channel manager instance." https://developer.apple.com/documentation/pushtotalk/ptchannelmanager/4031737-channelmanager So is a shared instance returned, or will a new instance be created? I need to know whether we have to implement a multi-delegate pattern for this scenario: for example, if someone else called this function, would they take over the callbacks or would they get their own instance? Thank you.
Posted
by lotam.
Last updated
.
Post not yet marked as solved
2 Replies
286 Views
Hi, I'm looking into ACME Managed Device Attestation and was wondering about one of the values in the payload: AllowAllAppsAccess. From the documentation: "If true, all apps have access to the private key." But in what case would you set this to true? It seems like it opens up the device to potentially malicious software. Also, if this were set to true, how would an app access this private key when it is stored in the Secure Enclave? Is there a specific tag that it is stored with?
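For reference, the usual way an app looks up a key it has access to is a keychain query via SecItemCopyMatching; whether an ACME-provisioned key is discoverable this way, and under which application tag, is exactly the open question here, so the tag below is purely hypothetical.

    import Foundation
    import Security

    // Hypothetical lookup by application tag; the actual tag (if any) attached to an
    // ACME-provisioned key is what this post is asking about.
    let query: [String: Any] = [
        kSecClass as String: kSecClassKey,
        kSecAttrApplicationTag as String: Data("com.example.acme-device-key".utf8),
        kSecAttrKeyType as String: kSecAttrKeyTypeECSECPrimeRandom,
        kSecReturnRef as String: true
    ]

    var item: CFTypeRef?
    let status = SecItemCopyMatching(query as CFDictionary, &item)
    if status == errSecSuccess {
        let privateKey = item as! SecKey
        // Use privateKey with SecKeyCreateSignature, etc.
    }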
Posted
by afoxon.
Last updated
.
Post not yet marked as solved
3 Replies
697 Views
I'm building a UIKit app that reads the user's Apple Music library and displays it. MusicKit has the Artwork structure, which I need to use to display artwork images in the app. Since I'm not using SwiftUI I cannot use the ArtworkImage view, which is the recommended way of displaying those images, but the Artwork structure has a method that returns a URL for the image, which can be used to load it. The way I have it set up is really simple:

    extension MusicKit.Song {
        func imageURL(for cgSize: CGSize) -> URL? {
            return artwork?.url(
                width: Int(cgSize.width),
                height: Int(cgSize.height)
            )
        }

        func localImage(for cgSize: CGSize) -> UIImage? {
            guard let url = imageURL(for: cgSize),
                  url.scheme == "musicKit",
                  let data = try? Data(contentsOf: url) else {
                return nil
            }
            return .init(data: data)
        }
    }

Now, every time I access the .artwork property (so a lot of times) the main thread gets blocked and the console output gets bombarded with messages like these:

    2023-07-26 11:49:47.317195+0200 Plum[998:297199] [Artwork] Failed to create color analysis for artwork: <MPMediaLibraryArtwork: 0x289591590> with error; Error Domain=NSCocoaErrorDomain Code=4099 "The connection to service named com.apple.mediaartworkd.xpc was invalidated: failed at lookup with error 159 - Sandbox restriction." UserInfo={NSDebugDescription=The connection to service named com.apple.mediaartworkd.xpc was invalidated: failed at lookup with error 159 - Sandbox restriction.}
    2023-07-26 11:49:47.317262+0200 Plum[998:297199] [Artwork] Failed to create color analysis for artwork: file:///var/mobile/Media/iTunes_Control/iTunes/Artwork/Originals/4b/48d7b8d349d2de858413ae4561b6ba1b294dc7
    2023-07-26 11:49:47.323099+0200 Plum[998:297013] [Plum] IIOImageWriteSession:121: cannot create: '/var/mobile/Media/iTunes_Control/iTunes/Artwork/Caches/320x320/4b/48d7b8d349d2de858413ae4561b6ba1b294dc7.sb-f9c7943d-6ciLNp'error = 1 (Operation not permitted)

My guess is that the most performance-heavy task here is performing the color analysis for each artwork, but IMO backgroundColor should not be a stored property if that's the case. I am not planning to use it anywhere, and if it is that expensive it should be a computed async property so it doesn't block the caller. I know I can move the call to a background thread, and that fixes the blocked main thread, but the loading times for each artwork are still terribly slow, and that impacts the UX. SwiftUI's ArtworkImage loads the artworks much quicker and without the errors, so there must be a better way to do it.
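For reference, a minimal sketch of moving the synchronous Data(contentsOf:) load off the main thread with Swift concurrency, reusing the imageURL(for:) helper from the extension above; it does not address the color-analysis cost, which appears to be triggered inside MusicKit itself.

    import MusicKit
    import UIKit

    extension MusicKit.Song {
        // Sketch: fetch the artwork data off the main thread, then build the image.
        func localImageAsync(for cgSize: CGSize) async -> UIImage? {
            guard let url = imageURL(for: cgSize), url.scheme == "musicKit" else { return nil }
            let data = await Task.detached(priority: .userInitiated) {
                try? Data(contentsOf: url)
            }.value
            return data.flatMap(UIImage.init(data:))
        }
    }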
Posted
by wiencheck.
Last updated
.
Post marked as solved
1 Reply
306 Views
I'm trying to implement ACME managed device attestation. I have ACME server code written in C#, and I've been able to get all of the steps working except the very last one: issuing the certificate. So far I have not been able to get the device to accept the certificate; the device logs show:

    Got certificate {length = ......}
    ACME request flow failed at step 9: Error Domain=NSOSStatusErrorDomain Code=-67673 "failed to obtain certificate" UserInfo={NSLocalizedDescription=failed to obtain certificate}

The certificate is issued by an internal CA, and the correct root certificate is in the device's trusted certs. I have tried returning the certificate chain to the device as a file response or content response with the "application/pem-certificate-chain" MIME type (the default outlined in the ACME RFC), returning just the leaf certificate as PEM, and returning the leaf certificate as DER with the MIME types "application/pkix-cert", "application/pkcs7-mime", "application/x-pkcs12", or "application/x-x509-ca-cert", but none of this has worked. Can anyone point me in the right direction to figure out what the issue is?
Posted
by afoxon.
Last updated
.
Post not yet marked as solved
5 Replies
1.9k Views
Hi, according to this WWDC session (https://developer.apple.com/wwdc22/10170):

"App Shortcuts are defined in Swift code, by implementing the AppShortcutsProvider protocol. To implement the protocol, I'll simply create a single getter that returns all the app shortcuts I want to set up for the user. Note that in total, your app can have a maximum of 10 app shortcuts. However, most apps only need a few."

So there is a limit of up to 10 App Shortcuts. Could you please clarify how that limit is handled? 🤔 (e.g. does the project fail to build, does the app crash or malfunction, or does iOS handle only 10 shortcuts, chosen randomly or in order?) I suppose there is some way to manage the number of shortcuts, but I see no details in the documentation yet.
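For reference, a minimal sketch of the provider the session describes; the intent, its title, and the phrase are made up for illustration.

    import AppIntents

    struct OpenJournalIntent: AppIntent {
        static var title: LocalizedStringResource = "Open Journal"

        func perform() async throws -> some IntentResult {
            // App-specific work would go here.
            return .result()
        }
    }

    struct MyAppShortcuts: AppShortcutsProvider {
        // The single getter the session refers to; at most 10 AppShortcut values in total.
        static var appShortcuts: [AppShortcut] {
            AppShortcut(
                intent: OpenJournalIntent(),
                phrases: ["Open my journal in \(.applicationName)"]
            )
        }
    }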
Posted
by vk_arc.
Last updated
.
Post not yet marked as solved
0 Replies
503 Views
I want to be able to detect when an auxiliary window is closed, in order to display a confirmation dialog box. Apple's Mail app on macOS exhibits this exact behaviour: when a user closes a message they were composing, a dialog box appears asking to save, don't save, or cancel. I am programmatically opening the auxiliary windows using the openWindow action in combination with the corresponding WindowGroup initialiser, as described in the article Presenting windows and spaces and the WWDC video Bring multiple windows to your SwiftUI app.
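For context, a rough sketch of the kind of setup described above (scene identifiers and view names are placeholders); the open question is how to detect the auxiliary window being closed so a save/don't save/cancel prompt can be shown.

    import SwiftUI

    struct MailLikeApp: App {
        var body: some Scene {
            WindowGroup {
                MainView()
            }
            // Auxiliary composer windows, opened by value.
            WindowGroup("Composer", id: "composer", for: UUID.self) { $draftID in
                ComposerView(draftID: draftID)
            }
        }
    }

    struct MainView: View {
        @Environment(\.openWindow) private var openWindow

        var body: some View {
            Button("New Message") {
                openWindow(id: "composer", value: UUID())
            }
        }
    }

    struct ComposerView: View {
        let draftID: UUID?

        var body: some View {
            TextEditor(text: .constant("Draft \(draftID?.uuidString ?? "")"))
        }
    }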
Posted
by Uasmel.
Last updated
.
Post not yet marked as solved
2 Replies
517 Views
Hello Apple Community, I've been looking into time-based activation predicates in DDM. Recently I've been experimenting with using the device's local time to evaluate a predicate expression and apply activation configurations. Is it possible to achieve this? Our DDM setup currently leverages device status items and server management properties to activate predicates: the activations kick in when the predicate logic becomes true. While the Apple Predicate Guide provides a solid foundation, I've run into some challenges with time-based expressions. The guide covers basics such as context- and numerical-based predicates, but I'm seeking more clarity on implementing time-based logic effectively. If any of you have insights, tips, or experiences to share regarding time-based activation predicate expressions in declarative device management, your input would be immensely valuable. I'm particularly interested in practical approaches and a deeper understanding of the nuances involved. Thank you in advance.
Posted
by Sithick.
Last updated
.
Post not yet marked as solved
0 Replies
388 Views
Hello, I'm currently experiencing an issue with the DeviceActivityMonitor extension in my code, specifically with the eventDidReachThreshold callback. I'm hoping to get some insight into why this problem occurs and how to resolve it.

Problem:

Issue 1: The eventDidReachThreshold callback is not triggering as expected. It appears that the callback is not invoked when the threshold is reached.

Issue 2: After a few seconds, the eventDidReachThreshold callback starts to trigger multiple times. This unexpected behavior causes problems in my code, as it results in incorrect actions being taken.

iOS version: iOS 16.7.2 and iOS 17.1
Xcode version: 15.0.1
Swift version: 5.9

Here is my code to start the monitoring:

    func startMonitoring() {
        let startTime = DateComponents(hour: 0, minute: 0)
        let endTime = DateComponents(hour: 23, minute: 59)

        /// Creates the schedule for the activity, specifying the start and end times, and setting it to repeat.
        let schedule = DeviceActivitySchedule(intervalStart: startTime,
                                              intervalEnd: endTime,
                                              repeats: true,
                                              warningTime: nil)

        /// Defines the event that should trigger the encouragement.
        let event = DeviceActivityEvent(applications: socialActivitySelection.applicationTokens,
                                        categories: socialActivitySelection.categoryTokens,
                                        webDomains: socialActivitySelection.webDomainTokens,
                                        threshold: DateComponents(minute: 2))

        let events: [DeviceActivityEvent.Name: DeviceActivityEvent] = [.socialScreenTimeEvent: event]

        do {
            activityCenter.stopMonitoring([.socialScreenTime])
            /// Tries to start monitoring the activity using the specified schedule and events.
            try activityCenter.startMonitoring(.socialScreenTime, during: schedule, events: events)
        } catch {
            /// Prints an error message if the activity could not be started.
            print("Could not start monitoring: \(error)")
        }
    }

If there are any known workarounds or potential solutions, please share them. Thank you for your help in resolving this problem.
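For context, the extension side that should receive the callback looks roughly like this (the class name is a placeholder, and the event name mirrors the code above):

    import DeviceActivity

    // Lives in the DeviceActivityMonitor app extension target.
    class SocialActivityMonitor: DeviceActivityMonitor {
        override func eventDidReachThreshold(_ event: DeviceActivityEvent.Name,
                                             activity: DeviceActivityName) {
            super.eventDidReachThreshold(event, activity: activity)

            guard event == DeviceActivityEvent.Name("socialScreenTimeEvent") else { return }
            // React to the threshold here, e.g. apply ManagedSettings shields or post a notification.
        }
    }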
Posted
by indigen.
Last updated
.
Post not yet marked as solved
2 Replies
1.7k Views
Is it possible to see a preview of the Live Activity UI we design? For a regular widget, we pass a WidgetPreviewContext modifier in which we specify the size of the widget to preview. Is it possible to do something similar for a Live Activity, so we can see how it would appear without having to run the app and check the Lock Screen?
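For reference, the regular-widget preview pattern referred to above looks roughly like this (the entry and view types are placeholders); the question is whether an equivalent exists for Live Activity layouts.

    import WidgetKit
    import SwiftUI

    struct SimpleEntry: TimelineEntry {
        let date: Date
    }

    struct MyWidgetEntryView: View {
        let entry: SimpleEntry

        var body: some View {
            Text(entry.date, style: .time)
        }
    }

    struct MyWidget_Previews: PreviewProvider {
        // The preview context specifies which widget size to render.
        static var previews: some View {
            MyWidgetEntryView(entry: SimpleEntry(date: .now))
                .previewContext(WidgetPreviewContext(family: .systemMedium))
        }
    }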
Posted
by jabhiji.
Last updated
.
Post not yet marked as solved
9 Replies
3.2k Views
The new Virtualization framework (and sample code!) is great. It's a lot of fun to run the sample code and quickly fire up multiple VMs of macOS running as a guest. However, the inability to authenticate with any iCloud services is a significant roadblock. Xcode, for example, is not allowing me to authenticate my developer account. Are there any plans to resolve this issue so that iCloud accounts can be authenticated from within a VM?
Posted
by kennyc.
Last updated
.
Post not yet marked as solved
4 Replies
1.8k Views
All of my builds get stuck on the Archive action - it keeps running forever (10+ hours, where a clean build previously took 30 minutes) and never finishes, despite all subtasks having finished (green check). This started happening on a workflow that has worked reliably for months, right after WWDC22 started - is there a problem with a new version of Xcode Cloud?
Posted Last updated
.
Post not yet marked as solved
1 Reply
1k Views
I'm looking for a way to check the user's Screen Time and trigger a notification after a specified amount of time spent in certain apps. I have the code from WWDC21, but I was wondering if I could get the code sample for the Worklog app created for WWDC22. It would really help as a reference point for building my project. Thank you!
Posted
by blanny21.
Last updated
.
Post not yet marked as solved
0 Replies
450 Views
Could somebody please point me to the project download mentioned in Novall Khan’s “Meet WeatherKit” presentation? I’d additionally or alternatively be interested in any working examples of using the REST API, e.g. in Python, to access WeatherKit. Thanks, Rob
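For reference, the Swift side of the framework is only a few lines; a minimal sketch (the coordinates are arbitrary, and the separate REST flow with its JWT authentication is not shown):

    import WeatherKit
    import CoreLocation

    func currentTemperatureDescription() async throws -> String {
        let cupertino = CLLocation(latitude: 37.3230, longitude: -122.0322)
        let weather = try await WeatherService.shared.weather(for: cupertino)
        return weather.currentWeather.temperature.formatted()
    }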
Posted Last updated
.
Post not yet marked as solved
1 Reply
1.1k Views
Hi, I have been trying out the SwiftUI Table and wanted to present a details view when a table row is clicked, but I couldn't figure out how to "deselect" a row once it has been selected. While this may not be what Table was intended for, I still think this code should be valid. (iPadOS)

    struct Person: Identifiable {
        let givenName: String
        let familyName: String
        let emailAddress: String
        let id = UUID()
    }

    private var people = [
        Person(givenName: "Juan", familyName: "Chavez", emailAddress: "juanchavez@icloud.com"),
        Person(givenName: "Mei", familyName: "Chen", emailAddress: "meichen@icloud.com"),
        Person(givenName: "Tom", familyName: "Clark", emailAddress: "tomclark@icloud.com"),
        Person(givenName: "Gita", familyName: "Kumar", emailAddress: "gitakumar@icloud.com")
    ]

    @State private var selectedPeople: Person.ID?
    @State private var detailsViewPresented: Bool = false

    var body: some View {
        Table(people, selection: $selectedPeople) {
            TableColumn("Given Name", value: \.givenName)
            TableColumn("Family Name", value: \.familyName)
            TableColumn("E-Mail Address", value: \.emailAddress)
        }
        .onChange(of: selectedPeople) { selection in
            guard selection != nil else {
                return
            }
            detailsViewPresented = true
        }
        .sheet(isPresented: $detailsViewPresented, onDismiss: {
            // Trying to reset the selection
            self.selectedPeople = nil
        }) {
            Text("Person's details")
        }
    }

Here, when I press a row, it gets selected and the Text is presented, but the row still remains selected. And yes, I could just use onTapGesture within the row content if I declared TableColumn with explicit content, but that would only be added to that column and would not provide the built-in selection style. https://developer.apple.com/documentation/swiftui/table
Posted
by J0s34h.
Last updated
.
Post not yet marked as solved
19 Replies
5.7k Views
I had a timer app; it played white noise after starting the timer, so my app runs in the background with background audio, and the timer is a perfect fit for display with a Live Activity. However, when I test my code on a real device, I find that calling await activity.update(using: contentState) while the app is running in the background does not work at all: the code executes, but the Live Activity won't get updated. After some experiments, I found: if the app is running in the background with background location or Picture-in-Picture mode, it can update the Live Activity from the background. If the app is running in the background with audio playing, it works on the simulator, but not on a real device. I submitted feedback: FB11683922 (Can't update Live Activity from app with ActivityKit when app is running in the background with background audio playing.) My code is like:

    func startLiveActivity() {
        // Prepare content state and attributes.
        do {
            self.activity = try Activity.request(attributes: activityAttributes, contentState: initialContentState)
            // Play audio so the app can keep running in the background.
            try playAudio()
        } catch (let error) {
            print("Error requesting Live Activity \(error.localizedDescription).")
        }
    }

    private func playAudio() throws {
        try AVAudioSession.sharedInstance().setCategory(.playback, options: .mixWithOthers)
        try AVAudioSession.sharedInstance().setActive(true)
        if self.player == nil {
            if let url = Bundle.main.url(forResource: "Forest", withExtension: "m4a") {
                player = try AVAudioPlayer(contentsOf: url)
                player?.numberOfLoops = -1
            }
        }
        player?.stop()
        player?.currentTime = 0
        player?.play()
    }

After the timer stops, the following code executes, but the Live Activity won't get updated:

    func updateActivity() {
        Task {
            if let activity = self.activity {
                // Prepare content state
                await activity.update(using: contentState)
            }
        }
    }
Posted Last updated
.
Post not yet marked as solved
1 Reply
656 Views
Hello there, I am trying to follow along with the video and copy the example shown here in SwiftUI. I am given the error Cannot assign value of type 'UIView' to type 'PKCanvasView?' on this line: resultView = overlayView. It is totally possible that I am botching the whole thing up, but I would appreciate it if someone looked over my code. Thanks. Code:

    // ContentView.swift
    import SwiftUI
    import PDFKit
    import PencilKit
    import Foundation
    import UIKit

    struct PDFUIView: View {
        let pdfDoc: PDFDocument
        let pdfView: PDFView

        init() {
            let url = Bundle.main.url(forResource: "example", withExtension: "pdf")!
            pdfDoc = PDFDocument(url: url)!
            pdfView = PDFView()
        }

        var body: some View {
            VStack {
                PDFKitView(showing: pdfDoc, pdfView: pdfView)
            }
            .padding()
        }
    }

    #Preview {
        PDFUIView()
    }

    struct PDFKitView: UIViewRepresentable {
        let pdfDocument: PDFDocument
        let pdfView: PDFView

        init(showing pdfDoc: PDFDocument, pdfView: PDFView) {
            self.pdfDocument = pdfDoc
            self.pdfView = pdfView
        }

        func makeUIView(context: Context) -> PDFView {
            pdfView.usePageViewController(true)
            pdfView.autoScales = true
            pdfView.pageOverlayViewProvider = context.coordinator
            pdfView.displayMode = .singlePageContinuous
            pdfView.isUserInteractionEnabled = true
            pdfView.document = pdfDocument
            pdfView.delegate = context.coordinator
            return pdfView
        }

        func updateUIView(_ pdfView: PDFView, context: Context) {
            pdfView.document = pdfDocument
        }

        func makeCoordinator() -> Coordinator {
            Coordinator()
        }
    }

    class Coordinator: NSObject, PDFPageOverlayViewProvider, PDFViewDelegate {
        var pageToViewMapping = [PDFPage: UIView]()

        func pdfView(_ view: PDFView, overlayViewFor page: PDFPage) -> UIView? {
            var resultView: PKCanvasView? = nil
            if let overlayView = pageToViewMapping[page] {
                resultView = overlayView
            } else {
                var canvasView = PKCanvasView(frame: .zero)
                canvasView.drawingPolicy = .anyInput
                canvasView.tool = PKInkingTool(.pen, color: .systemCyan, width: 20)
                canvasView.backgroundColor = UIColor.clear
                pageToViewMapping[page] = canvasView
                resultView = canvasView
            }
            let page = page as! MyPDFPage
            if let drawing = page.drawing {
                resultView?.drawing = drawing
            }
            return resultView
        }

        func pdfView(_ pdfView: PDFView, willDisplayOverlayView overlayView: UIView, for page: PDFPage) {
            guard let overlayView = overlayView as? PKCanvasView else { return }
            guard let canvasView = pageToViewMapping[page] else { return }
            let page = page as! MyPDFPage
            page.drawing = overlayView.drawing
            pageToViewMapping.removeValue(forKey: page)
        }

        class MyPDFAnnotation: PDFAnnotation {
            override func draw(with box: PDFDisplayBox, in context: CGContext) {
                UIGraphicsPushContext(context)
                context.saveGState()
                let page = self.page as! MyPDFPage
                if let drawing = page.drawing {
                    let image = drawing.image(from: drawing.bounds, scale: 1)
                    image.draw(in: drawing.bounds)
                }
                context.restoreGState()
                UIGraphicsPopContext()
            }
        }

        class MyPDFPage: PDFPage {
            var drawing: PKDrawing?
        }
    }
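One possible way around the reported error, assuming the overlay views stored in pageToViewMapping are always PKCanvasView instances: conditionally cast the cached UIView back to PKCanvasView (or, alternatively, declare the dictionary as [PDFPage: PKCanvasView]). A sketch of the first option, mirroring the names in the code above:

    func pdfView(_ view: PDFView, overlayViewFor page: PDFPage) -> UIView? {
        var resultView: PKCanvasView?

        // Conditionally cast the cached overlay; the dictionary stays declared as [PDFPage: UIView].
        if let overlayView = pageToViewMapping[page] as? PKCanvasView {
            resultView = overlayView
        } else {
            let canvasView = PKCanvasView(frame: .zero)
            canvasView.drawingPolicy = .anyInput
            canvasView.tool = PKInkingTool(.pen, color: .systemCyan, width: 20)
            canvasView.backgroundColor = .clear
            pageToViewMapping[page] = canvasView
            resultView = canvasView
        }

        if let myPage = page as? MyPDFPage, let drawing = myPage.drawing {
            resultView?.drawing = drawing
        }
        return resultView
    }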
Posted
by xcodegeek.
Last updated
.
Post marked as solved
1 Reply
576 Views
I am using the same ChannelManager; however, when switching to another channel, I leave the current channel and then call requestToJoin for a new channel with the newly entered ChannelName and PTDescriptor. The name and image do not change when I go to the background and look at the native UI. Am I missing a call to update the PTDescriptor?
Posted
by lotam.
Last updated
.
Post not yet marked as solved
0 Replies
509 Views
The child views of my container need to get (not set) the size of the container from within their view body, in order to perform some calculations. I've made a custom container that conforms to the Layout protocol (the actual implementation isn't important). The required placeSubviews method has a bounds parameter, which gives the size of the container. Does anyone know whether it is possible to store the bounds somewhere so that the subviews of the container can access them, such as in an environment key of the view hierarchy? Or is my only option to use a GeometryReader?

    struct CustomContainer: Layout {
        func sizeThatFits(proposal: ProposedViewSize, subviews: Subviews, cache: inout Void) -> CGSize {
            // Calculate and return the size of the layout container.
        }

        func placeSubviews(in bounds: CGRect, proposal: ProposedViewSize, subviews: Subviews, cache: inout Void) {
            // Tell each subview where to appear.
            // Can I store the bounds parameter somewhere??
        }
    }
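For comparison, the GeometryReader route mentioned at the end could be combined with a custom environment key, so the children still read the size from their body rather than from a closure parameter. This is only a sketch, under the assumption that the CustomContainer above is completed; ContainerSizeKey, MeasuredContainer, and ChildView are hypothetical names.

    import SwiftUI

    // Hypothetical environment key carrying the container's size.
    private struct ContainerSizeKey: EnvironmentKey {
        static let defaultValue: CGSize = .zero
    }

    extension EnvironmentValues {
        var containerSize: CGSize {
            get { self[ContainerSizeKey.self] }
            set { self[ContainerSizeKey.self] = newValue }
        }
    }

    // Wraps the custom layout in a GeometryReader and publishes the measured size
    // to all subviews through the environment.
    struct MeasuredContainer<Content: View>: View {
        @ViewBuilder var content: Content

        var body: some View {
            GeometryReader { proxy in
                CustomContainer {
                    content
                }
                .environment(\.containerSize, proxy.size)
            }
        }
    }

    // A child can then read the size from its own body.
    struct ChildView: View {
        @Environment(\.containerSize) private var containerSize

        var body: some View {
            Text("Container is \(Int(containerSize.width)) × \(Int(containerSize.height)) pt")
        }
    }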
Posted
by Uasmel.
Last updated
.
Post not yet marked as solved
1 Reply
992 Views
Hello! Since Stage Manager now occupies a set of interactions at the left edge of the screen to display a new dock, we'd like to conditionally turn off features or provide a workaround so that we don't conflict with those interactions. Is there any way in AppKit to check whether Stage Manager is currently turned on in the system? Any help would be greatly appreciated!
Posted
by sheb.
Last updated
.