I want to have real-time image anchor tracking together with RoomPlan.
But it's frustrating that nothing seems to support this.
It would be useful to have interactive elements in the scanned room.
Ideally both should run at the same time. If that's not possible, how do you align the two tracking spaces when running RoomPlan first and then ARKit image tracking? It sounds like a headache.
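One possible way to align the two tracking spaces, sketched under the assumption that the sessions run sequentially: capture an ARWorldMap at the end of the RoomPlan session (its underlying ARSession is exposed via the `arSession` property from iOS 17), then start a world-tracking session with that map as `initialWorldMap` and image detection enabled, so image anchors land in the same coordinate system. `"referenceImages"` here is a hypothetical asset catalog group name.

```swift
import ARKit
import RoomPlan

// Sketch: after stopping RoomCaptureSession, grab the world map from its
// underlying ARSession (the `arSession` property requires iOS 17).
func saveWorldMap(from captureSession: RoomCaptureSession,
                  completion: @escaping (ARWorldMap?) -> Void) {
    captureSession.arSession.getCurrentWorldMap { worldMap, _ in
        completion(worldMap)
    }
}

// Later, run image tracking in the same coordinate space by relocalizing
// against the saved map.
func runImageTracking(in session: ARSession, worldMap: ARWorldMap) {
    let config = ARWorldTrackingConfiguration()
    config.initialWorldMap = worldMap
    config.detectionImages = ARReferenceImage.referenceImages(
        inGroupNamed: "referenceImages", bundle: nil)
    session.run(config, options: [.resetTracking, .removeExistingAnchors])
}
```

Relocalization only completes once the camera sees surroundings that match the saved map, so anchors placed before and after the handoff should then share one coordinate space.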
Post not yet marked as solved
How can we change the image quality, size, camera, and cadence used during RoomPlan's scanning? We are getting the images from RoomCaptureSession.
Post not yet marked as solved
Error:
RoomCaptureSession.CaptureError.exceedSceneSizeLimit
Apple Documentation Explanation:
An error that indicates when the scene size grows past the framework’s limitations.
Issue:
This error pops up on my iPhone 14 Pro (128 GB) after a few RoomPlan scans are done, even when the room size is small. It occurs immediately after I start the RoomCaptureSession, right after relocalisation of the previous AR session (in a world tracking configuration). I am having trouble understanding exactly why this error appears and how to debug or solve it.
Does anyone have any idea how to approach this issue?
Post not yet marked as solved
Is there a way to handle RoomPlan's exceedSceneSizeLimit error? There is nothing about it in the documentation either. For me, after taking just 5 or 6 small scans, the error appears on the 7th scan. Only killing the app and restarting works; going back to the previous page and starting a new scan again does not help.
I'm working with the merge-scans feature of Apple's RoomPlan API, which was added in iOS 17.
It is said that scans can be merged in two ways:
a continuous ARSession
ARRelocalization
I've worked with ARRelocalization. I take a first scan, then come back the next day and want to take another scan. Can we start the second scan anywhere in that particular house and still merge them? Currently, when I stop the first scan, I save the world map (the snapshot/anchors of the point where I stopped), and the next day I load the saved map before starting a new scan. It then requires me to go back to where the last scan stopped; only there does it sense the environment and relocalize into the previous scan's coordinate system.
Then I can take the new scan, and the two can be merged.
But what I want is this: suppose I take a scan and, for some reason, miss the bedroom. The next day I want to scan only the bedroom. Can I start directly in the bedroom, without going back to the point where the last scan stopped, and still merge the two scans? Is that even possible?
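For reference, a minimal sketch of the save/load step, assuming the standard ARWorldMap persistence flow. Relocalization only succeeds where ARKit recognizes previously mapped surroundings, which is why the session asks you to return to where the last scan ended; starting fresh in an unscanned bedroom has no map data to match against.

```swift
import ARKit

// Persist the ARWorldMap captured at the end of a scan so it can be
// reloaded later via ARWorldTrackingConfiguration.initialWorldMap.
func archive(_ worldMap: ARWorldMap, to url: URL) throws {
    let data = try NSKeyedArchiver.archivedData(withRootObject: worldMap,
                                                requiringSecureCoding: true)
    try data.write(to: url)
}

func loadWorldMap(from url: URL) throws -> ARWorldMap? {
    let data = try Data(contentsOf: url)
    return try NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self,
                                                  from: data)
}
```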
Post not yet marked as solved
We scan the room using the RoomPlan API, and after the scan we obtain white objects with shadows and shading. However, when we update the color of these objects, we lose the shadows and shading.
RoomPlan scan
After Update
Post not yet marked as solved
Has anyone had success using RoomPlan to scan an entire home? We use floor plan scanning technology in our real estate business, and RoomPlan is great because it gives instant results, but we need the entire home, including the staircase, for total square footage purposes.
Thanks in advance,
Luke
Post not yet marked as solved
Hi
Is it possible to run RoomCaptureSession and ARSession together? We need feature points.
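A minimal sketch, assuming iOS 17, where RoomCaptureSession exposes its underlying ARSession, so feature points can be read from the current frame during a scan:

```swift
import ARKit
import RoomPlan

// Sketch (iOS 17): read the raw feature points ARKit is tracking while
// a RoomCaptureSession scan is in progress.
func featurePoints(of captureSession: RoomCaptureSession) -> [SIMD3<Float>] {
    captureSession.arSession.currentFrame?.rawFeaturePoints?.points ?? []
}
```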
Post not yet marked as solved
Hey guys, I hope you are all doing well! I am really stuck on a problem and need help. I am new to Swift but need to use it to complete a project. I am trying to use this code https://github.com/jmousseau/RoomObjectReplicatorDemo/blob/main/RoomObjectReplicatorDemo/ViewController.swift with the RoomObjectReplicator library https://github.com/jmousseau/RoomObjectReplicator. However, my app uses SwiftUI rather than UIKit, and I have a ContentView and an ARViewContainer. How can I implement this code with ContentView and ARViewContainer in SwiftUI? Thanks in advance!
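A common pattern (sketched here, not taken from the demo repo) is to wrap the UIKit ViewController in a UIViewControllerRepresentable so it can sit inside a SwiftUI ContentView. `ViewController` refers to the class in RoomObjectReplicatorDemo.

```swift
import SwiftUI
import UIKit

// Sketch: bridge the demo's UIKit view controller into SwiftUI.
struct ReplicatorViewContainer: UIViewControllerRepresentable {
    func makeUIViewController(context: Context) -> ViewController {
        ViewController()
    }

    func updateUIViewController(_ uiViewController: ViewController,
                                context: Context) {
        // No dynamic state to push from SwiftUI in this minimal sketch.
    }
}

// Usage inside an existing ContentView:
// var body: some View {
//     ReplicatorViewContainer().ignoresSafeArea()
// }
```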
Post not yet marked as solved
A case occurred where the StructureBuilder crashes with EXC_BAD_ACCESS, and the error cannot be handled without the app crashing.
I have two minimalistic models, one even reduced to the minimum of the "coreModel" itself. (See attachment)
Each model alone in the StructureBuilder works fine.
Using both causes the crash.
Has anyone found a way to handle this error without the app crashing?
```swift
override func viewDidLoad() {
    super.viewDidLoad()

    let capturedRooms: [CapturedRoom] = [
        loadCapturedRoom(fileName: "appleModel1"),
        loadCapturedRoom(fileName: "appleModel2")
    ]
    buildStructure(capturedRooms)
}

func buildStructure(_ capturedRooms: [CapturedRoom]) {
    let structureBuilder = StructureBuilder(options: [])
    Task {
        print("----- START BUILDING STRUCTURE -----")
        do {
            let capturedStructure = try await structureBuilder.capturedStructure(from: capturedRooms)
        } catch {
            print("----- FAILED BUILDING STRUCTURE -----")
        }
        // Crashing with: EXC_BAD_ACCESS
        // This part is never reached
        print("----- FINISH BUILDING STRUCTURE -----")
    }
}

func loadCapturedRoom(fileName: String) -> CapturedRoom {
    do {
        guard let jsonFileURL = Bundle.main.url(forResource: fileName, withExtension: "json") else {
            fatalError("JSON file not found.")
        }
        let data = try Data(contentsOf: jsonFileURL)
        return try JSONDecoder().decode(CapturedRoom.self, from: data)
    } catch {
        fatalError(error.localizedDescription)
    }
}
```
appleModel1.json
appleModel2.json
Post not yet marked as solved
When I start the RoomPlan scanner, it shows prompts such as "Move device to start" and others. I'm using RoomPlan in China and want to show custom prompts in my app. How should I do that?
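The built-in RoomCaptureView coaching UI isn't customizable as far as I know, but if you drive RoomCaptureSession directly with your own UI, the delegate hands you coaching instructions that you can map to your own localized strings. A sketch (the case list may not be exhaustive):

```swift
import RoomPlan

// Sketch: map RoomCaptureSession's coaching instructions to custom text.
final class ScanCoach: RoomCaptureSessionDelegate {
    var onInstruction: ((String) -> Void)?

    func captureSession(_ session: RoomCaptureSession,
                        didProvide instruction: RoomCaptureSession.Instruction) {
        let text: String
        switch instruction {
        case .moveCloseToWall:  text = "Move closer to the wall"
        case .moveAwayFromWall: text = "Move away from the wall"
        case .slowDown:         text = "Slow down"
        case .turnOnLight:      text = "Turn on the lights"
        case .normal:           text = ""
        @unknown default:       text = ""
        }
        onInstruction?(text)
    }
}
```

The strings above are placeholders; substitute your own localized wording.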
Post not yet marked as solved
Hi,
I want to capture the real-world colors and textures of objects in RoomPlan scans and export the model as USDZ. Is there a way to achieve this?
Post not yet marked as solved
I am currently working on a project that merges multiple scans into a single structure using RoomPlan JSON files. I am facing an issue and would greatly appreciate your assistance in resolving it.
The problem I am encountering is as follows:
"Cannot process multiFloorPlan: Invalid room location in structure"
Captured room JSON file paths:
https://mimesisfbs.s3.ap-south-1.amazonaws.com/project_3dfile_json/2088a003-5712-4e9f-91eb-601b145ca98e-project_3dfile_json_file.json
https://mimesisfbs.s3.ap-south-1.amazonaws.com/project_3dfile_json/10ccd0d9-7843-41bd-a021-a57781231059-project_3dfile_json_file.json
We are using the same code as in the example:
https://developer.apple.com/documentation/RoomPlan/merging_multiple_scans_into_a_single_structure
Post not yet marked as solved
Hi all.
Can anyone show code for multi-room scanning in SwiftUI?
And another question: is it possible to change the parameters (dimensions) of walls, doors, and windows in an already scanned room by entering real measurements from a laser tape measure? The scanned object has dimensional errors of up to 3 cm.
Thanks.
Post not yet marked as solved
I'm looking for the full sample code from WWDC23 session https://developer.apple.com/videos/play/wwdc2023/10192
Post not yet marked as solved
Hello,
we are using the RoomPlan API, and our users are facing errors during scanning at a rate of more than 10%.
The errors are of different types but don't really suit the context. For example, the error message "Room size exceeded" (translated from German) pops up even though the room is relatively small, while bigger rooms are not a problem.
We would like to know exactly what triggers these errors so that we can reproduce them. We plan to build workarounds, ideally by snapshotting the ARSession so scanning can continue later.
Unfortunately, on errors the ARSession ends and all scanned data is lost.
Has anyone else encountered this?
Post not yet marked as solved
We have a USDZ file produced by the RoomPlan API that we need to display as a mesh object. Please suggest how to convert it.
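If you still have the CapturedRoom (or the JSON it decodes from), one approach on iOS 17 is to re-export with the mesh option instead of the default parametric one; a sketch:

```swift
import RoomPlan

// Sketch: export a CapturedRoom as a mesh USDZ rather than
// parametric primitives (USDExportOptions, iOS 17).
func exportAsMesh(_ room: CapturedRoom, to url: URL) throws {
    try room.export(to: url, exportOptions: .mesh)
}
```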
Post not yet marked as solved
I need to create a 2D floor plan, with dimensions labeled, from the 3D result generated by RoomPlan. Is there a relatively straightforward way to create it, or will it require elaborate manual coding?
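One sketch of a starting point, assuming you work from the CapturedRoom data rather than the USDZ: each wall is a Surface with a transform and dimensions, so projecting its endpoints onto the XZ plane yields 2D segments (in meters) that can be drawn and labeled.

```swift
import RoomPlan
import simd

// A 2D wall segment on the floor plane, in meters.
struct WallSegment {
    let start: SIMD2<Float>
    let end: SIMD2<Float>
    var length: Float { simd_distance(start, end) }
}

// Sketch: project each wall's endpoints from its local space into world
// space, then drop the Y (height) axis to get a top-down floor plan.
func floorPlanSegments(from room: CapturedRoom) -> [WallSegment] {
    room.walls.map { wall in
        let halfWidth = wall.dimensions.x / 2
        let localLeft  = SIMD4<Float>(-halfWidth, 0, 0, 1)
        let localRight = SIMD4<Float>( halfWidth, 0, 0, 1)
        let worldLeft  = wall.transform * localLeft
        let worldRight = wall.transform * localRight
        return WallSegment(start: SIMD2(worldLeft.x, worldLeft.z),
                           end: SIMD2(worldRight.x, worldRight.z))
    }
}
```

Rendering the segments and their `length` values with Core Graphics or SwiftUI `Path` would then give a labeled 2D plan.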
Post not yet marked as solved
Our app is available in the App Store and works well on iOS 16 devices.
A few days ago, we noticed weird bugs from iOS 17 in Organizer, with the only hint being "NO_CRASH_STACK".
After installing iOS 17 on an iPhone, we were able to reproduce the crash directly at launch, but only when the app is downloaded from the App Store (there is no crash when the app is installed with Xcode 15 beta).
"type": "EXC_CRASH",
"signal": "SIGABRT"
},
"termination": {
"code": 4,
"flags": 518,
"namespace": "DYLD",
"indicator": "Symbol missing",
"details": [
"(terminated at launch; ignore backtrace)"
],
"reasons": [
"Symbol not found: _$s8RoomPlan0A14CaptureSessionCACycfc",
"Referenced from: <XXXX----XXXXXXX> /Volumes/VOLUME//.app/",
"Expected in: <XXXX--**-XXXXX-XXXXXXX> /System/Library/Frameworks/RoomPlan.framework/RoomPlan"
]
Does anybody else encounter this issue?
What should we do to solve it?
thanks!
Post not yet marked as solved
I'm unable to debug on an iPhone 12 running iOS 17 beta 3 via the network. I'm running Xcode 15 beta 6.
The device shows up in Devices and Simulators, and I can debug when connected with a cable.
However, the "Connect via network" option is grayed out, even though, oddly, the checkbox is ticked.
I'm developing an app using RoomPlan, so network connectivity is a must for debugging.
Has anyone else encountered this and knows how to get around the problem?
Other devices running iOS 16 connect via the network just fine.