When I add a WebP image to the asset catalog I get this warning:
/Users/..../Assets.xcassets The image set "Card-Back" references a file "Card-Back.webp", but that file does not have a valid extension.
Everything works; I see all my images perfectly.
How can I fix the 200+ warnings?
There is a week left until the end of Swift Student Challenge submissions. I've built an app that uses FoundationModels, but unfortunately there is no support for it in Swift Playground. Is there any chance it might be added in the coming days?
Topic:
Developer Tools & Services
SubTopic:
Swift Playground
Tags:
Swift Student Challenge
Swift Playground
In SwiftUI foundations: Build great apps with SwiftUI, the toolbar has a navigationTitle aligned to the leading edge.
I tried to create the same layout, but it fails.
How was that done?
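One way to approximate a leading-aligned title is to place a custom title view in a leading toolbar item. This is a minimal sketch, not necessarily how the session's sample was built; the title text and list content are placeholders:

```swift
import SwiftUI

// Minimal sketch: a leading-aligned title rendered as a toolbar item.
// "Wishlist" and the list rows are placeholder assumptions.
struct LeadingTitleView: View {
    var body: some View {
        NavigationStack {
            List(0..<5, id: \.self) { index in
                Text("Item \(index)")
            }
            // Hide the default centered title and supply our own on the leading edge.
            .navigationBarTitleDisplayMode(.inline)
            .toolbar {
                ToolbarItem(placement: .topBarLeading) {
                    Text("Wishlist")
                        .font(.headline)
                }
            }
        }
    }
}
```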
Hi All, We are facing App Rejection from Apple due to this -
Guidelines 5.1.1(i) - Legal - Privacy - Data Collection and 5.1.2(i) - Legal - Privacy - Data Use
Issue Description
The app appears to share the user’s personal data with a third-party AI service but the app does not clearly explain what data is sent and identify who the data is sent to before sharing the data.
Apps may only use, transmit, or share personal data after they meet all of the following requirements:
Disclose what data will be sent
Specify who the data is sent to
Obtain the user’s permission before sending data
Identify in the privacy policy what data the app collects, how it collects that data, all uses of that data, and confirm any third party the app shares data with provides the same or equal protection
Next Steps
If the app sends user data to a third-party AI service, revise the app to explain what data is sent and identify who the data is sent to before sharing personal data with a third-party AI service.
If it does not already, the app’s privacy policy must also identify what data the app collects, how it collects that data, and all uses of that data, including if it is shared with a third-party AI service.
If the app does not send user data to a third-party AI service or does not include a third-party AI service, reply to this rejection to confirm and add this information to the App Review Information section of App Store Connect.
We have updated the privacy policy, ATT, and Nutrition Labels, and we have added a consent screen to explicitly obtain the user's consent for AI services. However, we are still seeing the rejection.
Has anyone else faced a similar issue, and what steps did they follow to resolve it?
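For reference, the kind of pre-sharing consent screen the guideline asks for can be sketched as follows. This is a minimal SwiftUI illustration, not our actual implementation; "ExampleAI", the copy, and the storage key are placeholder assumptions:

```swift
import SwiftUI

// Minimal sketch of a consent screen shown before any data leaves the device.
// "ExampleAI" and the AppStorage key are placeholders.
struct AIConsentView: View {
    @AppStorage("didConsentToAISharing") private var didConsent = false

    var body: some View {
        VStack(spacing: 16) {
            Text("Before you continue")
                .font(.title2.bold())
            // Disclose WHAT is sent and WHO receives it, per guideline 5.1.2(i).
            Text("To generate suggestions, the text you enter will be sent to ExampleAI, a third-party AI service. No other personal data is shared.")
                .multilineTextAlignment(.center)
            Button("Allow and continue") {
                didConsent = true // gate all AI features behind this flag
            }
            .buttonStyle(.borderedProminent)
            Button("Not now", role: .cancel) {
                // Keep AI features disabled; nothing is sent.
            }
        }
        .padding()
    }
}
```

The key points App Review appears to check are that the disclosure names the recipient and the data, and that it appears before the first transmission, not merely in the privacy policy.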
Hello,
I am building an iOS-only, commercial app that uses AVSpeechSynthesizer with system voices, strictly using the APIs provided by Apple. Before distributing the app, I want to ensure that my current implementation does not conflict with the iOS Software License Agreement (SLA) and is aligned with Apple’s intended usage.
For a better playback experience (more accurate estimation of utterance duration and smoother skip forward/backward during playback), I currently synthesize speech using:
AVSpeechSynthesizer.write(_:toBufferCallback:)
Converting the received AVAudioPCMBuffer buffers into audio data
Storing the audio inside the app sandbox
Playing it back using AVAudioPlayer / AVAudioEngine
The cached audio is:
Generated fully on-device using system voices
Stored only inside the app’s private container
Used only for internal playback controls (timeline, seek, skip ±5 seconds)
Never shared, exported, uploaded, or exposed outside the app
The alternative approaches would be:
Keeping the generated audio entirely in memory (RAM) for playback purposes, without writing it to the file system at any point
Or using AVSpeechSynthesizer.speak(_:) and playing speech strictly in real time which has a poorer user experience compared to my approach
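For context, the caching pipeline described above looks roughly like this. It is a minimal sketch under simplified assumptions (error handling trimmed, output URL supplied by the caller):

```swift
import AVFoundation

// Sketch: synthesize an utterance and cache the PCM audio to a file
// inside the app sandbox. The synthesizer must be kept alive for the
// duration of synthesis, so the caller should hold a strong reference.
func cacheSpeech(_ text: String,
                 using synthesizer: AVSpeechSynthesizer,
                 to url: URL,
                 completion: @escaping (Error?) -> Void) {
    let utterance = AVSpeechUtterance(string: text)
    var file: AVAudioFile?

    synthesizer.write(utterance) { buffer in
        guard let pcm = buffer as? AVAudioPCMBuffer, pcm.frameLength > 0 else {
            // A zero-length buffer signals the end of synthesis.
            completion(nil)
            return
        }
        do {
            if file == nil {
                // Create the file lazily, once the output format is known.
                file = try AVAudioFile(forWriting: url, settings: pcm.format.settings)
            }
            try file?.write(from: pcm)
        } catch {
            completion(error)
        }
    }
}
```

Playback of the resulting file with AVAudioPlayer then gives accurate duration and seek, which is the motivation described above.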
I have reviewed the current iOS Software License Agreement:
https://www.apple.com/legal/sla/docs/iOS18_iPadOS18.pdf
In particular, section (f) mentions restrictions around System Characters, Live Captions, and Personal Voice, including the following excerpt:
“…use … only for your personal, non-commercial use…
No other creation or use of the System Characters, Live Captions, or Personal Voice is permitted by this License, including but not limited to the use, reproduction, display, performance, recording, publishing or redistribution in a … commercial context.”
I do not see a specific reference in the SLA to system text-to-speech voices used via AVSpeechSynthesizer, and I want to be certain that temporarily caching synthesized speech for internal, non-exported playback is acceptable in a commercial app.
My question is:
Is caching AVSpeechSynthesizer system-voice output inside the app sandbox for internal playback acceptable, or is Apple’s recommended approach to rely only on real-time playback (speak(_:)) or strictly in-memory buffering without file storage?
If this question falls outside DTS technical scope and is instead a policy or licensing matter, I would appreciate guidance on the authoritative Apple documentation or the correct Apple team/contact.
Thank you.
Here’s a recap of the Live Q&A for SwiftUI foundations: Build great apps with SwiftUI. If you participated and asked questions, thank you for coming and participating! If you weren’t able to join us live, we hope this recap is useful.
Where can I watch the VOD? Is the sample code “Wishlist” that was shown available for download?
You can view the replay of the entire event here: https://www.youtube.com/watch?v=Z3vloOtZLkQ
The sample code for the Wishlist app will be made available in the coming weeks on the Apple Developer website; we'll send an update via email when it is available.
What are the best practices when it comes to building complex navigations in SwiftUI?
The developer website has documentation on navigation style best practices.
Explore navigation basics like NavigationStack and TabView to get a ground-up understanding. For documentation on navigation APIs see Navigation.
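As a ground-up starting point, a stack inside a tab can be sketched like this. The tab name, Landmark type, and destinations are placeholder assumptions, not the Wishlist sample:

```swift
import SwiftUI

// Minimal sketch of NavigationStack inside a TabView.
// Landmark and the tab label are placeholders.
struct Landmark: Identifiable, Hashable {
    let id = UUID()
    let name: String
}

struct AppTabs: View {
    private let landmarks = [Landmark(name: "Half Dome"), Landmark(name: "El Capitan")]

    var body: some View {
        TabView {
            NavigationStack {
                List(landmarks) { landmark in
                    // Value-based links pair with navigationDestination(for:).
                    NavigationLink(landmark.name, value: landmark)
                }
                .navigationDestination(for: Landmark.self) { landmark in
                    Text(landmark.name)
                }
                .navigationTitle("Landmarks")
            }
            .tabItem { Label("Browse", systemImage: "list.bullet") }
        }
    }
}
```

Value-based navigation like this also makes programmatic navigation and deep linking straightforward, since the stack's path is just data.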
How can I integrate UIKit with my SwiftUI app? What about adding SwiftUI into my UIKit app?
See UIKit integration: Add UIKit views to your SwiftUI app, or use SwiftUI views in your UIKit app. Both UIKit and SwiftUI provide API to show a view hierarchy of the other.
For UIKit to SwiftUI, you would use UIViewControllerRepresentable.
For SwiftUI to UIKit, you would use UIHostingController.
Landmarks: Interfacing with UIKit walks you through step by step how to implement UIKit in SwiftUI with UIViewControllerRepresentable, and this WWDC22 video demonstrates UIHostingController, for those that want to add SwiftUI to their UIKit.
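In code, the two directions look roughly like this. A minimal sketch; the plain UIViewController and the Text view stand in for your real controller and SwiftUI content:

```swift
import SwiftUI
import UIKit

// UIKit -> SwiftUI: wrap a view controller so SwiftUI can display it.
// The bare UIViewController here is a stand-in for your own controller.
struct WrappedControllerView: UIViewControllerRepresentable {
    func makeUIViewController(context: Context) -> UIViewController {
        UIViewController()
    }
    func updateUIViewController(_ uiViewController: UIViewController, context: Context) {
        // Push state from SwiftUI into the controller here when it changes.
    }
}

// SwiftUI -> UIKit: host a SwiftUI view inside a UIKit hierarchy.
func presentSwiftUI(from parent: UIViewController) {
    let host = UIHostingController(rootView: Text("Hello from SwiftUI"))
    parent.present(host, animated: true)
}
```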
Does Wishlist feature a new iOS 26 font? How can I add custom fonts and text to my app?
We’re glad to hear many of you liked the wide text shown in Wishlist; however, it is the default system font with some light SwiftUI styling! Check it out for yourself in the sample code when it's made available, and you can learn more about customizing fonts and text by seeing Font and Applying custom fonts to text.
Does Xcode have a dependency graph we can use to optimize our SwiftUI Views?
Xcode comes with Instruments. Instruments is the best way to figure out what is causing excessive updates and other performance issues. That link provides direct tutorials and resources on how to use and understand it. Previews also offer many useful tools for analyzing SwiftUI views; for more info, see Previews in Xcode.
Check out this video from our latest WWDC, Optimize SwiftUI performance with Instruments, for information on how to use Instruments to profile and optimize your app with real-world applications.
If you still have questions, check out the Instruments section of these forums and create a post so the community has the opportunity to help guide you.
Are there UI debugging tools to help diagnose layout issues?
Yes, Xcode also features a View Debugger, accessed by selecting View Debug Hierarchy. Use the View Debugger to capture and inspect your view hierarchy and identify which views affect window sizing. The SwiftUI Inspector also lets you examine view frames and layout behavior.
See Diagnosing issues in the appearance of a running app to learn about debugging visual and layout issues.
As an absolute beginner, what would be the first go-to step to go for training? Do I need prior knowledge of frameworks to get started with SwiftUI?
A great place to learn how to develop for Apple platforms is with Pathways! Many developers start with Develop in Swift tutorials, which exposes you to several frameworks while teaching you the basics of SwiftUI. When you're ready to take your learning further, you can read the documentation for the specific frameworks that interest you at https://developer.apple.com/documentation/.
Topic:
UI Frameworks
SubTopic:
SwiftUI
When rendering an equirectangular video on a sphere using VideoMaterial and MeshResource.generateSphere(), there is a visible black seam line running vertically on the sphere. This appears to be at the UV seam where the texture coordinates wrap from 1.0 back to 0.0.
The same video file plays without any visible seam in other 360° video players on Vision Pro, so the issue is not with the video content itself.
Here is the relevant code:
private func createVideoSphere(content: RealityViewContent, player: AVPlayer) {
    let sphere = MeshResource.generateSphere(radius: 1000)
    let material = VideoMaterial(avPlayer: player)
    let entity = ModelEntity(mesh: sphere, materials: [material])
    entity.scale *= .init(x: -1, y: 1, z: 1) // Flip to render on inside
    content.add(entity)
    player.play()
}
The setup is straightforward:
MeshResource.generateSphere(radius: 1000) generates the sphere mesh
VideoMaterial(avPlayer:) provides the video texture
The X scale is flipped to -1 so the texture renders on the inside of the sphere
The video is a standard equirectangular 360° MP4 file
What I've tried:
I attempted to create a custom sphere mesh using MeshDescriptor with duplicate vertices at the UV seam (longitude 0°/360°) to ensure proper UV continuity. However, VideoMaterial did not render any video on the custom mesh (only audio played), and the app eventually crashed. It seems VideoMaterial may have specific mesh requirements.
Questions:
Is the black seam a known limitation of MeshResource.generateSphere() when used with VideoMaterial for 360° video?
Is there a recommended way to eliminate this UV seam — for example, a texture addressing mode or a specific mesh configuration that works with VideoMaterial?
Is there an official sample project or code example for playing 360° equirectangular video in a fully immersive space on visionOS? That would be extremely helpful as a reference.
Any guidance would be greatly appreciated. Thank you!
Member has access to App Store Connect but no access to the Apple Developer Program on the same team
I am enrolled via an individual account and have the appropriate privileges on the App Store Connect portal. However, upon trying to access the Certificates page, it presents the error message "Unable to find a team with the given Team ID 'to which you belong".
My question is: is it possible that when I was added to the team, I was only added on the App Store Connect portal and not also in the Apple Developer Program? How would I be explicitly added through the Apple Developer account as well?
I made a notarization request a few hours ago, and when I woke up to check the history it was no longer available; not rejected or accepted, just not found. I have gone ahead and made another request, but I have little confidence because I expect the same thing to happen again. Any guidance?
See logs below:
daramfon@MacBook-Pro-3 frontend % xcrun notarytool history --apple-id "$APPLE_ID" --password "$APPLE_APP_SPECIFIC_PASSWORD" --team-id "$APPLE_TEAM_ID"
Successfully received submission history.
history
--------------------------------------------------
createdDate: 2026-02-20T23:53:14.066Z
id: 6f2fadc0-2e8f-4331-a253-68f81334ebc6
name: Speakeasy AI-0.1.0-arm64.zip
status: In Progress
--------------------------------------------------
createdDate: 2026-02-20T23:47:12.897Z
id: 435aec4f-5356-49a5-898d-48aaafb7949f
name: Speakeasy AI.zip
status: In Progress
--------------------------------------------------
createdDate: 2026-02-20T22:35:23.947Z
id: 95896757-873a-4e54-a527-03dc767c9cb5
name: Speakeasy AI.zip
status: In Progress
daramfon@MacBook-Pro-3 frontend % xcrun notarytool history --apple-id "$APPLE_ID" --password "$APPLE_APP_SPECIFIC_PASSWORD" --team-id "$APPLE_TEAM_ID"
No submission history.
daramfon@MacBook-Pro-3 frontend % xcrun notarytool info 6f2fadc0-2e8f-4331-a253-68f81334ebc6 --apple-id "$APPLE_ID" --password "$APPLE_APP_SPECIFIC_PASSWORD" --team-id "$APPLE_TEAM_ID"
Submission does not exist or does not belong to your team.
id: 6f2fadc0-2e8f-4331-a253-68f81334ebc6
My phone says "Slow Charger" even though it's an original Apple power adapter and cable. It's new, and I think it's a software issue. If it is, please fix it; if it's not, I'll try fixing it myself. A few other things I don't like about iOS 26.4: overheating, Safari bugs, and the Apple Music UI not working. Please try to fix those as well.
Hello,
I am attempting to request the endpoint-security.client entitlement for my app using the following form:
https://developer.apple.com/contact/request/system-extension/
After submitting the form, I consistently receive an HTTP 500 error from Apple’s servers.
Could you please provide guidance on whether this is a known issue or if there is something I may be doing incorrectly?
I appreciate your assistance.
My Swift Student Challenge submission is an iPad app built in Xcode, and I'm planning to select the Xcode 26 option for testing in the dropdown provided in the application. I just want to confirm: the run destination for the playground in Xcode will be an iPad simulator, right? Recently I have seen many participants post submission screenshots with iPhone renders, so I wanted to confirm the run destination.
Thank you👾
Since the introduction of the sibling / and /System/Volumes/Data volume architecture, some very basic, critical commands seem to have broken behaviour (cp, rsync, tar, cpio…).
For example, ditto, which was introduced more than 10 years ago to correctly handle all the peculiarities of Apple's HFS filesystem as compared to the UFS Unix filesystem, is not behaving correctly.
For example, from man ditto:
--rsrc Preserve resource forks and HFS meta-data. ditto will
store this data in Carbon-compatible ._ AppleDouble files
on filesystems that do not natively support resource forks.
As of Mac OS X 10.4, --rsrc is default behavior.
[...]
--extattr Preserve extended attributes (requires --rsrc). As of Mac
OS X 10.5, --extattr is the default.
and nonetheless:
# ls -@delO /private/var/db/ConfigurationProfiles/Store
drwx------@ 5 root wheel datavault 160 Jan 20 2024 /private/var/db/ConfigurationProfiles/Store
*********
com.apple.rootless 28
***************************
# mkdir tmp
# ditto /private/var/db/ConfigurationProfiles tmp
ditto: /Users/alice/Security/Admin/Apple/APFS/tmp/Settings: Operation not permitted
ditto: /Users/alice/Security/Admin/Apple/APFS/tmp/Store: Operation not permitted
# ls -@delO tmp/Store
drwx------ 5 root wheel - 160 Aug 8 13:55 tmp/Store
*
#
The extended attribute on the copied directory Store is empty and the file flags are missing: not preserved as documented, and as has been ditto's usual behaviour since macOS 10.5.
cp, rsync, tar, and cpio exhibit the same misbehaviour. But I was using ditto precisely to be sure to avoid any incompatibility with Apple's proprietary filesystem modifications.
As a consequence, backup scripts and applications fail more or less silently and produce corrupted copies of files or directories. (I was investigating why one of my security backup shell scripts was making corrupted backups, and only on macOS.)
How can I recover the documented behaviour, with --extattr working, on modern macOS?
Topic:
App & System Services
SubTopic:
Core OS
Tags:
Files and Storage
macOS
Security
Security Foundation
I am running postfix on macOS Sequoia, and need it to log any kind of error so I can fix them.
I found that in this version of macOS, syslogd is configured with the file /etc/asl/com.apple.mail, which contains:
# mail facility has its own log file
? [= Facility mail] claim only
> /var/log/mail.log mode=0644 format=bsd rotate=seq compress file_max=5M all_max=50M
* file /var/log/mail.log
which is its installed configuration and seems correct.
Postfix is started (by launchd) and running (ps ax | grep master), but on sending, errors occur and nothing is logged.
How can I make postfix log to /var/log/mail.log, which is the normal way on millions of postfix servers around the world?
Topic:
App & System Services
SubTopic:
Core OS
Hello,
I hope you’re doing well.
I recently submitted a new app (App ID: 6759405609), which is currently in “Waiting for Review.” I understand that review times can vary, and I truly appreciate the work the review team does.
My first app took almost a week to be reviewed, so I was wondering if timelines are currently similar across the board or if they may vary depending on factors like app category or developer history.
I have a launch commitment with a client scheduled for this Monday, so I’m simply trying to plan accordingly and set realistic expectations.
Any insight would be greatly appreciated.
Thank you very much for your time.
Kindest regards,
Jose Rodriguez
Topic:
App Store Distribution & Marketing
SubTopic:
App Review
Hi,
iOS 26.4 beta
iPhone 13 Pro Max
After I installed this beta, my phone lost the ability to fast charge. I tried different cables and power adapters; nothing helped. Now it says "slow charge", and charging from 20% to 80% takes more than 2 hours.
Topic:
App & System Services
SubTopic:
Core OS
Hi,
I would like clarification on whether the new hover effects feature introduced in visionOS 26 supports pinch gestures through the PSVR2 controllers.
In your sample application, I found that this was not working. Pulling the trigger on the controller while looking at the 3D object did not activate the hover effect's spatial event (the object shows the highlight, though); only pinch-clicking with my fingers seems to register/trigger the spatial event.
I am using visionOS 26.3.
This is inconsistent with how the PSVR2 controller behaves on SwiftUI views and UIView elements, where the trigger press does count as a button click.
The sample I used was this one:
https://developer.apple.com/documentation/compositorservices/rendering_hover_effects_in_metal_immersive_apps
When asking Siri to run a shortcut, it spawns two processes called BackgroundShortcutRunner that do not die when the shortcut is done running. If the Siri window is on screen when I speak, it does not spawn those two processes. However, if the Siri window is not on screen when I speak, the Siri window appears and spawns these two processes, and they do not terminate once the shortcut is done running. I now have over 200 of these processes since rebooting three days ago to install 26.4.
FB22015192
Topic:
App & System Services
SubTopic:
Automation & Scripting
Hello Apple Team,
Please, we need you to come back to the level of support and consistency we used to have.
Right now, the review process and the assistance we’re receiving don’t reflect the standards we’ve known from Apple.
Developers don’t deserve this style of support: long waits, unclear rejections, and no real answers.
We rely on the App Store to run our businesses, and we simply need the professionalism, clarity, and responsiveness that Apple was always known for.
Please help bring that quality back. We deserve better.
Thank you.
I created a date form field using an input element with type="date" and a placeholder attribute.
On Safari and Chrome desktop, it behaves as expected. Safari shows the current date in grey by default, and Chrome displays a format hint like dd.mm.yyyy, which is perfectly fine.
On iOS, however, the field appears completely blank. I understand that the placeholder attribute is not part of the iOS date input behavior, which is technically fine. Still, it would be helpful if developers had the option to define a default display value. In the past, browsers prefilled date inputs, but many developers objected because they needed the field to be empty by default.
I have searched extensively and tried several AI tools, and everywhere it says that this cannot be changed. Am I missing something, or is there any way to display a placeholder, the current date, or some kind of visual hint in iOS Safari?
Right now, the empty field creates poor UX because users may overlook it. Since the field is required, this can easily lead to validation errors and additional friction.
As a workaround, I used a CSS hack with input[type="date"]::before and a content attribute. I also added JavaScript to toggle a pseudo-placeholder value specifically for iOS.
Is there a cleaner solution that avoids this workaround?
Thanks in advance for your guidance.