Post not yet marked as solved
I have a question about an app that uses PencilKit.
I have a UIImageView added as a subview on top of a PKCanvasView. I would like to include that image in the PKDrawing, so that it is retrieved as data along with the rest of the drawing information via dataRepresentation().
How can I do this?
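For anyone sketching a workaround: PKDrawing only stores stroke data, so dataRepresentation() will never include a UIImageView layered on top of the canvas. One common alternative (a minimal sketch; CanvasArchive is a hypothetical name, not a PencilKit type) is to serialize the image bytes next to the drawing bytes in a single container:

```swift
import Foundation

// Hypothetical container: PKDrawing cannot store images, so archive the
// image bytes next to the drawing bytes and persist the whole thing.
struct CanvasArchive: Codable {
    var drawingData: Data   // e.g. from PKDrawing.dataRepresentation()
    var imageData: Data?    // e.g. from imageView.image?.pngData()

    func encoded() throws -> Data {
        try JSONEncoder().encode(self)
    }

    static func decoded(from data: Data) throws -> CanvasArchive {
        try JSONDecoder().decode(CanvasArchive.self, from: data)
    }
}

// On load, the two pieces would be rebuilt separately, e.g.:
//   let drawing = try PKDrawing(data: archive.drawingData)
//   imageView.image = archive.imageData.flatMap(UIImage.init(data:))
```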
Post not yet marked as solved
Has anyone got PKToolPicker appearing in an iPad app running in compatibility mode on Vision Pro?
In our app, it appears fine on iPad, but not in the Vision Pro simulator.
Even though the tools do not appear, I am able to draw on the canvas (though not change the pen, of course).
I did not read anywhere that it is unsupported on Vision Pro. I only saw that Apple Pencil interactions do not work, but that other forms of interaction (e.g. drawing with touch) should.
Anyone seen it working on Vision Pro?
Post not yet marked as solved
I am making an app that uses PencilKit so you can draw on a canvas. However, I want to be able to detect whether the resulting drawing is in contact with a Rectangle().
Is there any way to do this?
I don't want to use variables for the X and Y positions, because I have more than 400 rectangles in a grid.
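One way to avoid per-rectangle X/Y variables is to sample each stroke's points and hit-test them against the grid rects. Below is a pure-geometry sketch; the PencilKit calls in the comment show how the points might be obtained, and touchedRectangleIndices is a hypothetical helper:

```swift
import Foundation

// Returns the indices of the grid rectangles touched by any sampled
// stroke point. In the app, the points could come from PencilKit, e.g.:
//   let points = canvasView.drawing.strokes.flatMap { stroke in
//       stroke.path.interpolatedPoints(by: .distance(4)).map {
//           $0.location.applying(stroke.transform)
//       }
//   }
func touchedRectangleIndices(points: [CGPoint], grid: [CGRect]) -> Set<Int> {
    var hits = Set<Int>()
    for (index, rect) in grid.enumerated() {
        if points.contains(where: { rect.contains($0) }) {
            hits.insert(index)
        }
    }
    return hits
}
```

With 400+ rectangles this linear scan is usually fast enough; if not, the grid origin can be divided into cell coordinates directly instead of scanning.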
Post not yet marked as solved
I've noticed that in the Freeform app the drawing strokes look very nice, like vector objects. Is there any way to get the same stroke quality as in the Freeform app?
Post not yet marked as solved
Is there any way to modify the strokes of a PKDrawing while the user is still drawing, i.e. in-process? I notice that canvasViewDrawingDidChange is only called after a stroke is finished.
My objective is to get real-time feedback by modifying the PKStroke.
Thank you in advance
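As far as I know, PencilKit does not expose strokes mid-gesture, so a commonly suggested fallback (an assumption-laden sketch, not a confirmed PencilKit workflow) is to post-process the most recent stroke inside canvasViewDrawingDidChange, for example by smoothing its points. The pure helper below shows only the smoothing step; smoothed is a hypothetical function:

```swift
import Foundation

// Simple moving-average smoothing over a stroke's sampled points.
// In the delegate one might do (assumption, not verified API flow):
//   func canvasViewDrawingDidChange(_ canvasView: PKCanvasView) {
//       guard let last = canvasView.drawing.strokes.last else { return }
//       // rebuild `last` from smoothed PKStrokePoints and replace it ...
//   }
func smoothed(_ points: [CGPoint], window: Int = 3) -> [CGPoint] {
    guard window > 1, points.count > window else { return points }
    return points.indices.map { i in
        let lo = max(0, i - window / 2)
        let hi = min(points.count - 1, i + window / 2)
        let slice = points[lo...hi]
        let n = CGFloat(slice.count)
        let sx: CGFloat = slice.reduce(0) { $0 + $1.x }
        let sy: CGFloat = slice.reduce(0) { $0 + $1.y }
        return CGPoint(x: sx / n, y: sy / n)
    }
}
```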
Post not yet marked as solved
In 2 days we have observed over 50 crashes in Crashlytics related to PKDrawing.image, with EXC_BAD_ACCESS KERN_INVALID_ADDRESS 0x0000000000000210.
So far all are on iPads with A10, A12, and A13 chips, and 100% on iOS 17 (17.1.1, 17.2, 17.3), whereas in our other crashes the share of iOS 17 is around 60-80%.
Context:
Images of varying frame and scale (yielding screen resolution) are generated sequentially, in a background serial queue, called almost one after another when requested. Each result updates a CALayer.contents, and only after this update on the main thread is the next generation allowed. One zoomable PKCanvasView is present on screen.
The crashing line in code:
let image = drawing.image(from: frame, scale: renderScale).cgImage
The questions:
Is there anything that can be done apart from throttling generation?
Can the circumstances of the crash be determined – are there any indications, accessible in code before calling PKDrawing.image, that the app might crash?
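Regarding throttling: one pattern we are considering (a sketch with hypothetical names, not a confirmed fix) is to coalesce requests so the serial queue only ever renders the most recent frame, dropping intermediate requests that would otherwise pile up behind a slow PKDrawing.image call:

```swift
import Foundation

// Holds only the most recent render request; older ones are dropped.
// The background serial queue calls take() before each render, so at
// most one PKDrawing.image(from:scale:) call is ever queued up.
final class LatestRequestBox {
    private var latest: (frame: CGRect, scale: CGFloat)?
    private let lock = NSLock()

    func submit(frame: CGRect, scale: CGFloat) {
        lock.lock(); defer { lock.unlock() }
        latest = (frame, scale)   // overwrite: intermediate requests coalesce
    }

    func take() -> (frame: CGRect, scale: CGFloat)? {
        lock.lock(); defer { lock.unlock() }
        defer { latest = nil }
        return latest
    }
}

// Usage on the render queue (sketch):
//   while let request = box.take() {
//       let image = drawing.image(from: request.frame, scale: request.scale)
//       ... update CALayer.contents on the main thread ...
//   }
```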
The traces:
0   AGXMetalA12   AGX::BlitContext<AGX::G11::Encoders, AGX::G11::Classes, AGX::G11::ObjClasses>::copyTextureToBuffer(IOGPUMetalResource const*, unsigned long, unsigned long, unsigned long, AGXA12FamilyTexture*, unsigned int, unsigned int, MTLOrigin, MTLSize, unsigned long) + 96
9   PencilKit     PKDrawing.image(from:scale:) + 28

0   AGXMetalA13   <redacted> + 96
9   PencilKit     PKDrawing.image(from:scale:) + 28

0   AGXMetalA10   <redacted> + 72
9   PencilKit     $s9PencilKit9PKDrawingV5image4from5scaleSo7UIImageCSo6CGRectV_12CoreGraphics7CGFloatVtF + 24
Post not yet marked as solved
Hello!
I've been teaching myself Swift and wanted to challenge myself by creating a drawing app. I want to add a feature that lets the user share the drawing on their canvas, but I'm having some difficulty. I tried using a ShareLink, but it requires the item to conform to Transferable. How do I do that?
import SwiftUI
import PencilKit
struct ContentView: View {
    @State private var canvasView = PKCanvasView()

    private var canvasImage: Image {
        let portionRect = CGRect(x: 0, y: 0, width: 100, height: 100)
        let scale: CGFloat = 1.0
        return Image(uiImage: canvasView.drawing.image(from: portionRect, scale: scale))
    }

    var body: some View {
        Text("Let's draw!")
        GeometryReader { geometry in
            VStack {
                Spacer()
                HStack {
                    Spacer()
                    // Pass in the canvas owned by ContentView; if PencilKitView
                    // creates its own, the share button reads an empty drawing.
                    PencilKitView(canvasView: canvasView)
                        .frame(width: geometry.size.width, height: geometry.size.height * 0.7)
                    Spacer()
                }
                Spacer()
            }
        }
        // share button: ShareLink is itself a view, so it belongs in the
        // hierarchy rather than inside a Button action. SwiftUI's Image
        // already conforms to Transferable, so wrapping the UIImage in an
        // Image avoids writing a custom conformance.
        ShareLink(
            item: canvasImage,
            preview: SharePreview(
                "Share Preview",
                image: canvasImage
            )
        ) {
            Text("share with a friend")
        }
    }
}

struct PencilKitView: UIViewRepresentable {
    let canvasView: PKCanvasView
    let toolPicker = PKToolPicker()

    func makeUIView(context: Context) -> PKCanvasView {
        canvasView.drawingPolicy = .anyInput
        toolPicker.addObserver(canvasView)
        toolPicker.setVisible(true, forFirstResponder: canvasView)
        canvasView.becomeFirstResponder()
        return canvasView
    }

    func updateUIView(_ uiView: PKCanvasView, context: Context) {}
}

#Preview {
    ContentView()
}
Post not yet marked as solved
Hi,
I have an app using PencilKit that works on visionOS 1.0. That is, a user can pick an inking tool from PKToolPicker and draw on a PKCanvasView. The app is now available on the Vision Pro App Store as well.
However, when I test the app on the visionOS 1.1 RC simulator, I can pick an inking tool, but when I try to draw on the canvas it just scrolls, and no drawing appears on the PKCanvasView.
I also noticed that the visionOS 1.0 simulator has the Freeform app, where you can draw with PencilKit, but the visionOS 1.1 RC simulator does not.
Is this a known issue? Will it be fixed before the release or is there a change in API so I can update the app accordingly?
Thanks
Zafer
Post not yet marked as solved
Hi, I am trying to make a simple note-taking app where users can draw on a PDFView with Apple Pencil.
(I used PDFKit and PencilKit.)
I followed the sample code from WWDC22's "What's new in PDFKit" session on overlay view providers ("overlayProvider" - you can see the code in the video), and I was able to draw on each PDF page's view.
But the issue is that the resolution of the overlay view (a subview of the PDFView) is low.
As far as I know, PKCanvasView draws vector-based strokes, so I never expected the image or the lines I draw to be this blurry.
Is this a bug, or is this normal? (I also added a UIButton as a subview of the PDFView, and the button looks blurry too.)
I even tried scaling up all the subviews after their layout was done, using contentScaleFactor.
PKCanvasView inherits from UIScrollView, so I also enlarged the frame of the PKCanvasView and fixed its zoom scale below 1.0: if the canvas looked blurry because it was somehow zoomed in the wrong way, zooming out should have been the solution. But that didn't work either; it is still blurry.
I tried other things as well, like changing the frame or size.
So, is anyone having the same problem, or can anyone offer a solution?
Please help me. I hope this is a bug that can be fixed at any moment.
(The first attached image is slightly zoomed in, and the drawing is blurry; the second shows a normal PKCanvasView drawing as a plain subview of the view controller's view, for comparison.)
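For reference, a pattern sometimes suggested for blurry overlay views in a zoomed PDFView is raising the overlay's contentScaleFactor to match the zoom. A minimal sketch of the scale computation (crispContentScaleFactor is a hypothetical helper; applying the result to the canvas view and its layer after zooming is an assumption, not verified PDFKit behavior):

```swift
import Foundation

// The overlay's backing store should hold screenScale * zoomScale pixels
// per point to stay crisp, clamped so extreme zooms do not allocate
// gigantic layers. Apply the result to canvasView.contentScaleFactor
// (and the layer's contentsScale) when the zoom changes.
func crispContentScaleFactor(screenScale: CGFloat,
                             zoomScale: CGFloat,
                             maxScale: CGFloat = 8) -> CGFloat {
    max(screenScale, min(screenScale * zoomScale, maxScale))
}
```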
Post not yet marked as solved
When drawing on a new PKCanvasView for the first time, existing drawing objects disappear.
A PKDrawing that had previously been drawn in a PKCanvasView was saved as separate data or a file.
Later, while creating a new PKCanvasView, I load the saved PKDrawing and assign it to the new canvas.
Below is the code.
canvasView!.drawing = draw!
The previously saved text and lines are displayed normally at first.
However, the first time I draw with the pencil (the moment the pencil touches the screen), the old writing and lines disappear.
After I finish a line or some text, if I pan across the screen, the old text and lines appear again.
Once this has happened and I touch the screen with the pencil again, it no longer occurs.
It happens only the first time after a new PKCanvasView is created and the previously saved PKDrawing is assigned to it.
Could you please help me understand why this happens and how to fix it?
Post not yet marked as solved
I wonder if an Apple engineer could confirm: will the Apple Pencil Pro squeeze functionality be detectable through the current API, or will it require a future iPadOS extension to the gesture recognizers / UIKit? I'd like to start experimenting with the functionality if it is surfaced through an existing event (a long press, perhaps?).