Interaction Design

Create engaging ways for users to interact with your software.

Posts under the Interaction Design tag

8 Posts

Post not yet marked as solved · 0 Replies · 229 Views
Hello, I have a game development team in my small company, and we've developed an obstacle game in Unity from scratch. Every single piece of UI, logic and even sound was implemented by our own developers and musicians, which means every single element is our own property. You can try the game out on the Android Play Store by searching for the name "Cherry Blossom Hills Obstacle". But once I submitted the same game to the iOS App Store, the reviewers kept saying it is 4.3.0 Design Spam, basically without any to-the-point feedback :( Could anyone help me resolve this issue? Even a single definitive help/suggestion would be highly appreciated. Regards, Md. Rezoanul Alam.
Posted. Last updated.

Post not yet marked as solved · 0 Replies · 296 Views
Hello! I need advice. According to Apple's guidelines, is it permissible to ask users during the initial app launch, through an onboarding screen (example in the screenshot), whether they are older than 16 when the Age Rating on the App Store is set to 9+? I want to determine whether a user from Europe or the UK has reached the age of consent in order to decide whether or not to display the GDPR consent form to them. Thanks!
Posted by zeil. Last updated.

Post not yet marked as solved · 0 Replies · 405 Views
Currently, there seems to be an all-or-nothing approach to supporting rotation on iPhone: either every screen in your UI supports rotation, or none of them do. For some apps, however, that approach won't work. They have a number of screens that don't adapt well to a super-letterboxed screen size, and a number of others that would benefit from the additional screen space. Previous discussion on this issue recommends the use of size classes, but that advice fails to recognise that some use cases simply aren't suited to being super letterboxed.

Apple's own UI design is tacit acknowledgement of this. For example, the main UI of the Camera app stays fixed in the portrait orientation in shooting mode, but presents a rotatable modal to review photos and videos. Even Springboard, the home screen of the iPhone, remains locked in the portrait orientation whilst allowing an app to be presented in landscape. Social media and news apps are another example: generally anchored around a portrait newsfeed that doesn't adapt well to extreme letterboxing, but surfacing rich media such as images, videos, charts and other interactive elements that could use the flexibility of landscape presentation. (News app, looking at you.)

Is it time to revisit the rotation characteristics of the phone vs. tablet idioms? Is this all-or-nothing approach to rotation serving the platform well? Regardless, app designers at Apple and elsewhere are creating apps that use this hybrid approach to rotation, and as things stand today, SwiftUI makes it very difficult. A rough equivalent can be made using a ZStack and observing the device orientation, but this requires hiding the status bar and provides no way to honor a user's portrait lock settings. The only other option, as far as I can tell, is building the app using UIKit view controllers in order to thread through supportedInterfaceOrientations hooks.

Personally, what I'd love to see is a new presentationInterfaceOrientations(_:) hook on View that allows a fullScreenCover presentation to be specified as supporting an alternative orientation set. This could be iPhone only and should serve the majority of use cases. In the meantime, however, it would be great to know if there's a technique that can get the UIKit behavior in a SwiftUI app without rewriting the entire container view hierarchy in UIKit.
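For reference, here is a minimal sketch of the UIKit route mentioned above: a UIHostingController subclass that overrides supportedInterfaceOrientations and is presented full screen over the portrait-locked SwiftUI content. The class name and the presentation helper are hypothetical, and this is only an illustration of the workaround, not the SwiftUI hook the post asks for.

import SwiftUI
import UIKit

// A hosting controller that widens (or restricts) the orientations
// available to the SwiftUI view it hosts.
final class OrientationLockedHostingController<Content: View>: UIHostingController<Content> {
    // Hypothetical knob: the orientations this screen may rotate to.
    var allowedOrientations: UIInterfaceOrientationMask = .portrait

    override var supportedInterfaceOrientations: UIInterfaceOrientationMask {
        allowedOrientations
    }
}

// Hypothetical usage: a portrait-only main screen presents a rotatable
// review screen as a full-screen modal, UIKit-style.
func presentRotatableReview(from presenter: UIViewController) {
    let review = OrientationLockedHostingController(rootView: Text("Photo review"))
    review.allowedOrientations = .allButUpsideDown
    review.modalPresentationStyle = .fullScreen
    presenter.present(review, animated: true)
}

My understanding is that, for a full-screen presentation, the system asks the presented controller for its supported orientations, so the main SwiftUI hierarchy can stay portrait-locked while this one modal rotates.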
Posted by tcldr. Last updated.

Post marked as solved · 2 Replies · 1.7k Views
Hello all, I would like to understand how to create a SwiftUI View that behaves like the official Camera app. When the device orientation changes, the view does not animate; the buttons simply rotate in place (4:3, 1x...). The Camera app's view is composed of the flash and Live Photos buttons, the camera preview, configuration buttons, and a big button to shoot the photo. In portrait it is laid out top to bottom, and in landscape left to right. Also, when the last-pictures view is shown, it adapts to the current orientation, as if the camera preview had been rendered in the same device orientation. Ideas? Thanks!
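One approach that gets close to this behaviour, sketched below, is to keep the whole interface locked to portrait and rotate only the individual controls when the physical device orientation changes, driven by UIDevice.orientationDidChangeNotification. The RotatingButton wrapper is hypothetical (not an existing API), and the angle mapping may need adjusting for a real app.

import SwiftUI
import UIKit
import Combine

// Rotates its label to follow the physical device orientation while the
// surrounding (portrait-locked) layout stays fixed, similar to the
// flash/ratio/zoom buttons in the Camera app.
struct RotatingButton: View {
    let title: String
    let action: () -> Void
    @State private var angle: Angle = .zero

    var body: some View {
        Button(title, action: action)
            .rotationEffect(angle)
            .animation(.easeInOut(duration: 0.25), value: angle)
            .onAppear {
                UIDevice.current.beginGeneratingDeviceOrientationNotifications()
            }
            .onReceive(NotificationCenter.default.publisher(
                for: UIDevice.orientationDidChangeNotification)) { _ in
                    switch UIDevice.current.orientation {
                    case .landscapeLeft:      angle = .degrees(90)
                    case .landscapeRight:     angle = .degrees(-90)
                    case .portraitUpsideDown: angle = .degrees(180)
                    case .portrait:           angle = .zero
                    default: break // face up/down: keep the current angle
                    }
            }
    }
}

The same idea could extend to the last-picture thumbnail: render it in the fixed portrait layout and counter-rotate it so it appears upright in the current device orientation.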
Posted by heltena. Last updated.

Post not yet marked as solved · 0 Replies · 681 Views
I'm currently working on a project where, on iOS, the user needs to be able to tap and drag on a SwiftUI Canvas to pan (via a DragGesture()), but on macOS the user should be able to pan with two fingers, and I'm not sure how to do that. I've looked around and found a suggested solution, combining a DragGesture with a MagnificationGesture/MagnifyGesture, but it's far from functional. I've provided some code below for just the macOS part.

import SwiftUI

struct InfiniteCanvasV: View {
    @State private var scale: CGFloat = 1.0
    @State private var translation: CGSize = .zero

    var body: some View {
        let gesture = SimultaneousGesture(
            MagnificationGesture()
                .onChanged { value in
                    self.scale = value
                    print("Scaling")
                },
            DragGesture(minimumDistance: 0, coordinateSpace: .global)
                .onChanged { value in
                    self.translation = value.translation
                    print("Translating")
                }
        )
        .onChanged { _ in
            print("Gesture changed")
        }

        return ZStack {
            Canvas { context, size in
                let scaledSize = CGSize(width: size.width * scale * 0.75,
                                        height: size.height * scale * 0.75)
                let xOffset = (size.width - scaledSize.width) / 2
                let yOffset = (size.height - scaledSize.height) / 2
                let scaledRect = CGRect(origin: CGPoint(x: xOffset, y: yOffset), size: scaledSize)
                context.stroke(
                    Path(roundedRect: scaledRect, cornerSize: CGSize(width: 10, height: 10)),
                    with: .color(.green),
                    lineWidth: 4
                )
            }
            .gesture(gesture)
            .frame(width: 300, height: 200)
            .border(Color.blue)

            Text("\(scale)")
                .allowsHitTesting(false)
        }
    }
}
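One possible direction for the macOS side, sketched under the assumption that a two-finger trackpad pan arriving as scroll events is acceptable: an NSViewRepresentable overlay whose NSView overrides scrollWheel(with:) and forwards the deltas, which could then drive the translation state instead of the DragGesture. The ScrollPanCatcher name and onPan callback are made up for this illustration.

import SwiftUI
import AppKit

// On macOS, two-finger trackpad pans are delivered as scroll events, so a
// transparent AppKit view can catch them and hand the deltas to SwiftUI.
struct ScrollPanCatcher: NSViewRepresentable {
    var onPan: (CGSize) -> Void

    final class PanView: NSView {
        var onPan: ((CGSize) -> Void)?

        override func scrollWheel(with event: NSEvent) {
            // scrollingDeltaX/Y carry the two-finger pan distance.
            onPan?(CGSize(width: event.scrollingDeltaX,
                          height: event.scrollingDeltaY))
        }
    }

    func makeNSView(context: Context) -> PanView {
        let view = PanView()
        view.onPan = onPan
        return view
    }

    func updateNSView(_ nsView: PanView, context: Context) {
        nsView.onPan = onPan
    }
}

Overlaying this on the Canvas and accumulating the deltas into translation would replace the DragGesture path on macOS, while the existing DragGesture could remain the iOS path behind #if os(iOS).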
Posted. Last updated.

Post not yet marked as solved · 0 Replies · 503 Views
I have a safety system app that allows people to activate a help alert and notify selected contacts. I want to link the activation of the help alert in my app so that it triggers the existing Apple Emergency SOS process. Once SOS is activated, it automatically calls emergency services in the jurisdiction the user is located in. Ref: https://support.apple.com/en-us/HT208076 In my app the help alert is activated through a two-step manual process, so a user has to consciously activate it. There is no interference with Apple processes, simply another pathway to activation. Question: Is there documentation available for this proposed process? Signposting or any help is most welcome.
Posted. Last updated.

Post not yet marked as solved · 1 Reply · 2.3k Views
Hi, on iPadOS 16 beta 1 we have been experimenting with connecting external 1080p and 4K monitors over USB Type-C on the latest M1 iPads. It appears to only display full screen (non-mirrored) when there is a HID device such as a mouse connected. This makes sense, as you need to be able to control the screen independently. However, for a touch-screen external display (e.g. INNOCN) that supports multi-touch, it does not work. iPadOS (and macOS) do not have any multi-touch drivers. Are there plans for iPadOS (and indeed eventually macOS) to support external multi-touch screens?
Posted by Shagpile. Last updated.

Post not yet marked as solved · 10 Replies · 4k Views
I have an app which requires the user to have direct interaction with a view when VoiceOver is enabled. In iOS 12 and 13 my code worked as expected. However, in iOS 14, whenever the user tries to interact with the view requiring direct interaction, the view responds with the audible message: "Direct touch area. Use the rotor to enable direct touch for this app." This is a bit of an awkward user experience for a visually impaired user. The user now has to enable something which was working just fine in previous versions. In addition, any time the user navigates away from the app and then returns, they must re-enable direct touch again. In my code I was able to enable direct interaction with the following:

view.isUserInteractionEnabled = true
view.isAccessibilityElement = true
view.accessibilityTraits = UIAccessibilityTraits.allowsDirectInteraction

However, this no longer seems to be enough. How do I tell my app: "Yes, please use direct interaction all the time and don't bother the user?"
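For what it's worth, a possible direction on newer systems: my understanding is that iOS 17 added per-element direct-touch options (UIAccessibility.DirectTouchOptions) alongside the allowsDirectInteraction trait. Whether silentOnTouch actually removes the rotor prompt is something to verify, so treat the sketch below as an assumption rather than a confirmed fix.

import UIKit

// Assumption: on iOS 17+, accessibilityDirectTouchOptions can be set so that
// VoiceOver passes touches straight through to a direct-touch area.
func configureDirectTouch(on view: UIView) {
    view.isUserInteractionEnabled = true
    view.isAccessibilityElement = true
    view.accessibilityTraits = .allowsDirectInteraction
    if #available(iOS 17.0, *) {
        // .silentOnTouch keeps VoiceOver quiet so touches reach the app directly.
        view.accessibilityDirectTouchOptions = .silentOnTouch
    }
}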
Posted by bb1999. Last updated.