Post not yet marked as solved
Hi all,
I have an .obj model which I converted using Reality Converter.
When I try to load it in Safari Quick Look, an error pops up: "Object requires a newer version of iOS".
I'm on an iPhone X with iOS 13.6.
Any thoughts?
Thanks
I'm trying to create a custom Quick Look preview on macOS. I've found the Quick Look Preview Extension target, which is brilliant, and does most of the 'heavy' lifting, but I've run into a few problems.
I'm implementing a preview for MIDI files (which has been missing since 2009...) using AVMIDIPlayer.
The player keeps playing when the file is no longer selected! What's the mechanism for fixing that? Some sort of check that the view exists..?
I notice that the OS preview for audio files has a different interface for the Finder's preview column and for the QuickLook 'pop-up' window. Again, I can't see how you define different views for those two environments.
Is there any documentation that's specifically "Mac"? I can only find iOS stuff. (Same for third-party tutorials.)
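For reference, a minimal sketch of the shape of preview controller I'm describing (class and method bodies are placeholders; stopping the player in viewWillDisappear is the workaround I'm currently experimenting with):

```swift
import Cocoa
import Quartz
import AVFoundation

// Hypothetical preview view controller for a Quick Look Preview Extension target.
class PreviewViewController: NSViewController, QLPreviewingController {

    var player: AVMIDIPlayer?

    func preparePreviewOfFile(at url: URL, completionHandler handler: @escaping (Error?) -> Void) {
        do {
            // A nil soundBankURL falls back to the default sound bank.
            player = try AVMIDIPlayer(contentsOf: url, soundBankURL: nil)
            player?.prepareToPlay()
            player?.play(nil)
            handler(nil)
        } catch {
            handler(error)
        }
    }

    // Workaround attempt: stop playback when the preview view goes away
    // (e.g. when another file is selected in the Finder column).
    override func viewWillDisappear() {
        super.viewWillDisappear()
        player?.stop()
    }
}
```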
Post not yet marked as solved
I am really struggling with this exercise, but I don't know what I am doing wrong.
Attached screenshots below:
Post not yet marked as solved
I'm working on a QuickLook extension for the file extension snd.
This extension was used by multiple companies in the '90s, most notably by Sun.
https://en.wikipedia.org/wiki/Au_file_format
However, it was also used by companies manufacturing samplers and synthesisers.
I'd like to preview files from the latter; however, I have not been able to get my extension to run when these files are previewed.
To verify my extension works, I added an exported UTI with the file extension dmf, which is completely owned by my app, and previewing and opening those files works as expected.
However, for the snd extension the system always uses the same preview (I assume QuickTime), and it doesn't matter whether I associate my app with all files using that file extension.
I'm assuming that since it's impossible to tell just from the filename what kind of file this is, the system simply assumes it's the Sun file format and uses one of the first-party previews, QuickTime or maybe Audio.qlgenerator.
Is there any way I could override this behaviour just for this one file extension?
(As a side note, I guess I could try the older QLGenerator route as well, but I recall not being able to play back audio files with those APIs no matter what I tried.)
Any help would be welcome.
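For what it's worth, this is roughly how I'm declaring the type in the extension's Info.plist (the identifier and description below are made up for illustration; I suspect the system's own declaration for .snd is what's overriding mine):

```xml
<!-- Imported UTI declaration (Info.plist fragment); identifier is hypothetical. -->
<key>UTImportedTypeDeclarations</key>
<array>
    <dict>
        <key>UTTypeIdentifier</key>
        <string>com.example.sampler-snd</string>
        <key>UTTypeDescription</key>
        <string>Sampler SND audio</string>
        <key>UTTypeConformsTo</key>
        <array>
            <string>public.audio</string>
        </array>
        <key>UTTypeTagSpecification</key>
        <dict>
            <key>public.filename-extension</key>
            <array>
                <string>snd</string>
            </array>
        </dict>
    </dict>
</array>
```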
Post not yet marked as solved
My iOS app has a Quick Look extension. Will this Quick Look extension be available in visionOS when run as an iPad app in the Shared Space and/or if I rebuild using the visionOS SDK?
Post not yet marked as solved
Just getting familiar with Xcode. Using Reality Composer a lot now. Ready to try coding along with Reality Composer.
Saw this demo (see link below), but I don't want to use a web server to retrieve banner information, I would prefer to embed this information directly into the USDZ file to be read with AR Quick Look.
**Two questions:**
1. How can you get a banner like this when you open a USDZ file, and edit the banner information directly (within the file itself) without using a URL?
2. In place of the call-to-action button (for Apple Pay) in the demo below, I'd like that button to either call a phone number, send a text, or go to a web URL.
Link to Apple's example with Apple Pay (see the custom examples section, e.g. the kids' slide example on that page):
https://developer.apple.com/augmented-reality/quick-look/
Scraps are welcome, hungry to learn.
Post not yet marked as solved
I've pored over the session "Discover Quick Look for spatial computing" and I was really impressed by "Windowed Quick Look" and how items can be opened in their own Volume and stay open even if the app/website they were opened from is closed.
I had an additional question: how long do items in Windowed Quick Look remain in the Shared Space after the app or web page they were opened from is closed?
I'm imagining something like a how-to document or diagram that users could be consulting visually but not interacting with. Will visionOS purge it from memory at some point, or will it persist indefinitely until the user manually closes it?
In the same example, if I were using a how-to document or diagram to help me work or learn alongside the app/site I was working or learning in, it would be more convenient if it stayed open so I could continue right where I left off.
So, if a user were to take off the Vision Pro for the evening and then put it back on in the morning, would the item they opened in Windowed Quick Look persist alongside the other apps/windows/volumes they had open in the Shared Space?
Post not yet marked as solved
I would like to see the complete code for the tutorial used in this video.
Post not yet marked as solved
I am using QLPreviewController in SwiftUI via UIViewControllerRepresentable. If I try to delete or insert pages in a large PDF, QLPreviewController does not call the delegate methods (didUpdateContentsOf, didSaveEditedCopyOf).
struct QuickLookController: UIViewControllerRepresentable {
    @Environment(\.dismiss) var dismiss
    weak var delegate: QLPreviewControllerDelegate?
    weak var dataSource: QLPreviewControllerDataSource?

    func makeUIViewController(context: Context) -> UINavigationController {
        let controller = context.coordinator.controller
        controller.delegate = delegate
        controller.dataSource = dataSource
        controller.navigationItem.rightBarButtonItem = context.coordinator.dismissButton
        return UINavigationController(rootViewController: controller)
    }

    func updateUIViewController(_ viewController: UINavigationController, context: UIViewControllerRepresentableContext<QuickLookController>) { }

    func makeCoordinator() -> Self.Coordinator {
        .init(parent: self)
    }

    @MainActor
    class Coordinator: NSObject {
        var parent: QuickLookController

        init(parent: QuickLookController) {
            self.parent = parent
        }

        lazy var controller: QLPreviewController = {
            let controller = QLPreviewController()
            return controller
        }()

        lazy var dismissButton: UIBarButtonItem = {
            let button = UIBarButtonItem(
                title: NSLocalizedString("Done", comment: ""),
                style: .plain,
                target: self,
                action: #selector(rightButtonTapped(_:))
            )
            button.tag = 2
            return button
        }()

        @objc func rightButtonTapped(_ sender: Any) {
            controller.dismiss(animated: true)
        }
    }
}
// MARK: QuickLook
extension ViewerModel: QLPreviewControllerDataSource, QLPreviewControllerDelegate {
    public func numberOfPreviewItems(in controller: QLPreviewController) -> Int {
        1
    }

    public func previewController(
        _ controller: QLPreviewController,
        previewItemAt index: Int
    ) -> QLPreviewItem {
        let title = self.document
            .documentURL?
            .lastPathComponent ?? ""
        let url = PDFManager
            .directory
            .appendingPathComponent(title) as NSURL
        return url as QLPreviewItem
    }

    public func previewControllerDidDismiss(_ controller: QLPreviewController) {
    }

    public func previewControllerWillDismiss(_ controller: QLPreviewController) {
    }

    // Same behavior even if I return .updateContents here
    public func previewController(_ controller: QLPreviewController, editingModeFor previewItem: QLPreviewItem) -> QLPreviewItemEditingMode {
        .createCopy
    }

    // Not called with a large PDF
    public func previewController(_ controller: QLPreviewController, didUpdateContentsOf previewItem: QLPreviewItem) {
    }

    // Not called with a large PDF
    public func previewController(_ controller: QLPreviewController, didSaveEditedCopyOf previewItem: QLPreviewItem, at modifiedContentsURL: URL) {
    }
}
Post not yet marked as solved
Hi,
I watched the WWDC23 session video, "Create 3D models for Quick Look spatial experiences."
https://developer.apple.com/videos/play/wwdc2023/10274/
In the video, I understood that the scale of models displayed via visionOS's AR Quick Look is determined by the "metersPerUnit" value in USDZ files. I tried to find tools that can set "metersPerUnit" in 3D software, or tools to view the "metersPerUnit" of a USDZ file, but I couldn't find any. I believe adjusting "metersPerUnit" is crucial to achieving real-world scale when displaying models through visionOS's AR Quick Look. If anyone knows of apps or tools that can read a USDZ's "metersPerUnit", or 3D editor apps or tools that can export with the "metersPerUnit" value properly set, I would greatly appreciate the information.
Best regards.
Sadao Tokuyama
https://twitter.com/tokufxug
https://www.linkedin.com/in/sadao-tokuyama/
Post not yet marked as solved
Hi,
I've implemented an ARKit app that displays a USDZ object in the real world. In this scenario, placement is via image recognition (a Reality Composer scene).
Obviously, when I don't see the image (QR marker), the app cannot detect the anchor and will not place the object in the real world.
Is it possible to recognize an image (QR marker), place the object on it, and then leave the object there?
So basically
- detect the marker
- place the object
- leave the object there, no longer depending on image (marker) recognition
Thanks
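In case it clarifies the idea, here's a rough sketch of one approach I'm considering (the function and entity names are placeholders): when the image anchor is detected, copy its world transform onto a world-fixed anchor so the model no longer tracks the image.

```swift
import ARKit
import RealityKit

// Hypothetical helper: once the image anchor is tracked, re-parent the model
// onto a world anchor at the same pose so it stays put when the marker is lost.
func pinObjectToWorld(in arView: ARView, modelEntity: Entity, imageAnchor: ARImageAnchor) {
    // World-space pose of the detected marker.
    let pose = imageAnchor.transform

    // Create an anchor fixed in world space (independent of the image).
    let worldAnchor = AnchorEntity(world: pose)
    arView.scene.addAnchor(worldAnchor)

    // Move the model under the world anchor while keeping its current pose.
    modelEntity.setParent(worldAnchor, preservingWorldTransform: true)
}
```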
Post not yet marked as solved
Is it possible to somehow support the copy/update editing modes when using the quickLookPreview modifier, like in QLPreviewController? After pressing 'Done' it only allows me to discard or save to Files instead.
Post not yet marked as solved
I've been using Reality Composer on macOS (installed with Xcode) to export interactive .reality files that can be hosted on the web and linked to, triggering Quick Look to open the interactive AR experience.
That works really well.
I've just downloaded the Xcode 15 beta, which ships with the new Reality Composer Pro, and I can't see a way to export .reality files anymore. It seems this is only for building apps that ship as native iOS (etc.) apps, rather than content that can be viewed in Quick Look.
Am I missing something, or is it no longer possible to export .reality files?
Thanks.
Post not yet marked as solved
Quick Look is broken on iOS 17; it does not open a reality file.
After I uninstalled Xcode and its CLI tools from my MacBook (Intel-based, Ventura 13.5.2), the operating system seems to have forgotten how to handle Markdown files. I can still open them using VS Code or TextEdit, but when I preview them with the Space bar they just show the file icon. My main issue is that I am using Shortcuts to interact with them, and those also stopped working, on my MacBook exclusively. On both iPad and iPhone they still work. I appreciate any ideas on how to resolve this issue.
Post not yet marked as solved
I am showing a PDF file in QLPreviewController. In iOS 17, after copying and pasting a page of the PDF, editing the original page with PencilKit also affects the duplicated page. How can I prevent this?
Thanks to all
Post not yet marked as solved
Hello. I've started exploring the new features in Reality Composer Pro and noticed that it now supports adding custom scripts as components to any object in the scene. I'm curious: will these scripts work if I export such a scene to a USDZ file and open it using Apple Quick Look? For instance, I want to add a 3D button and a cube model; when I press (touch) the button, I want a script component to change the material or material color to another one. Is such functionality possible?
Post not yet marked as solved
Hi,
I'm developing Quick Look extensions for my app's custom files to display previews and thumbnails in the Finder. When I developed and debugged these extensions, they were listed under "Added extensions" in System Settings > Privacy & Security > Extensions, and they worked.
But they don't appear in that list when I build a package and install it on my Mac or a testing machine, and Quick Look doesn't work at all.
Should I configure build settings or packaging options to make them register or work?
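For diagnosing this, these are the built-in macOS commands I've been using to check whether the system sees the extensions (the file path is just an example):

```shell
# List the Quick Look preview extensions the system currently knows about.
pluginkit -m -p com.apple.quicklook.preview

# List registered thumbnail extensions as well.
pluginkit -m -p com.apple.quicklook.thumbnail

# Ask Quick Look to render a preview for a sample file (example path).
qlmanage -p ~/Documents/sample.myext
```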
Post not yet marked as solved
Hi, I'm new to iOS dev.
I am developing an app with an AR function. I found a few tutorials about AR Quick Look; however, they all use storyboards. Is there any way to use SwiftUI to demonstrate AR Quick Look?
ContentView.swift
import SwiftUI
//import QuickLook
//import ARKit
struct ContentView: View {
    @State private var isPresented = false

    var body: some View {
        VStack {
            Button {
                isPresented = true
                print("click")
            } label: {
                Text("Click to AR")
                    .font(.title)
                    .fontWeight(.bold)
                    .padding()
                    .background()
                    .cornerRadius(16)
            }
            .sheet(isPresented: $isPresented) {
                ARView()
            }
            .padding()
        }
    }
}

#Preview {
    ContentView()
}
ARView.swift
import SwiftUI
struct ARView: UIViewControllerRepresentable {
    func makeUIViewController(context: Context) -> QuickViewController {
        QuickViewController()
    }

    func updateUIViewController(_ uiViewController: QuickViewController, context: Context) {
        uiViewController.presentARQuickLook()
    }

    typealias UIViewControllerType = QuickViewController
}
QuickViewController.swift
import UIKit
import QuickLook
import ARKit

class QuickViewController: UIViewController, QLPreviewControllerDelegate,
                           QLPreviewControllerDataSource {

    // How many models to present
    func numberOfPreviewItems(in controller: QLPreviewController) -> Int {
        return 1
    }

    // Display the model
    func previewController(_ controller: QLPreviewController, previewItemAt index: Int) -> QLPreviewItem {
        let url = Bundle.main.url(forResource: "bear", withExtension: "usdz")!
        // Load file url
        let preview = ARQuickLookPreviewItem(fileAt: url)
        return preview
    }

    func presentARQuickLook() {
        let previewController = QLPreviewController()
        previewController.dataSource = self
        present(previewController, animated: true)
        print("Open AR model!")
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view.
    }
}
Post not yet marked as solved
Can a customized Pages document be created and integrated into an Apple Multiplatform App?