The new Core Spotlight APIs in iOS 18.4 and aligned releases for using Apple Intelligence models to summarize messages sort of work, but in my testing only about 80% of the requests I send to Spotlight come back to the delegate with summaries; the rest are never returned, and no errors are logged in the delegate methods.
I can't figure out if there's a pattern. Before I take my code apart, I'm wondering if anyone else here is using these brand-new APIs and has seen anything similar.
It's odd because my code that submits entities to Spotlight for summarization is the same for all of them, yet some are just never returned.
Hi all. I need to save an array of strings in UserDefaults. I am using
try NSKeyedArchiver.archivedData(withRootObject: rootObject, requiringSecureCoding: false)
to convert the array of strings to Data and then save it in UserDefaults.
In order to retrieve the data, I am using
let data = self.userDefaults.data(forKey: "key")!
let unarchiver = try NSKeyedUnarchiver(forReadingFrom: data)
unarchiver.requiresSecureCoding = false
// Decode the root object and cast it to the expected string array type.
let array = unarchiver.decodeObject(forKey: NSKeyedArchiveRootObjectKey) as? [String]
This was working perfectly until iOS 18. Starting with iOS 18, on a couple of devices we get an empty string array when we retrieve the value back from UserDefaults. We observed this on an iPhone 12 Pro and an iPhone 15 running iOS 18. The iPhone 12 Pro hits this issue almost once every day; on the iPhone 15 it happens once every 2-3 days.
I tried printing the raw data directly from UserDefaults, and I can see some data there. But when we convert that back to an array of strings, it comes out empty. I tried adding logs in the catch block but never got any. What might be the cause of this issue?
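For reference, here is the same round trip as a self-contained sketch (the key name and sample values are made up), plus the archiver-free route UserDefaults itself supports for plain string arrays:

import Foundation

// Minimal, self-contained round trip of the flow described above.
// The key name and sample array are placeholders.
let defaults = UserDefaults.standard
let strings = ["alpha", "beta", "gamma"]

do {
    // Archive the string array and store the resulting Data.
    let archived = try NSKeyedArchiver.archivedData(withRootObject: strings,
                                                    requiringSecureCoding: false)
    defaults.set(archived, forKey: "key")

    // Read it back and unarchive.
    if let data = defaults.data(forKey: "key") {
        let unarchiver = try NSKeyedUnarchiver(forReadingFrom: data)
        unarchiver.requiresSecureCoding = false
        let restored = unarchiver.decodeObject(forKey: NSKeyedArchiveRootObjectKey) as? [String]
        print(restored ?? [])
    }
} catch {
    print("archiving error:", error)
}

// For a plain [String], UserDefaults can also store the array directly,
// with no archiver involved:
defaults.set(strings, forKey: "key")
let direct = defaults.stringArray(forKey: "key")
print(direct ?? [])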
I want to create a brush similar to a fountain pen, with a three-dimensional feel to the strokes and a distinct tip. Alternatively, is it possible to achieve this by modifying the configuration parameters of a fountain pen brush?
Since iOS 18.3.1, QLPreviewController shows a blank white screen instead of the document; on lower iOS versions it works fine. Additionally, it does not display the 'Done' option at the top-right to close the view.
Presenting the QLPreviewController the first time displays the document fine, but the second time it renders the blank white screen described above.
When launching the QLPreviewController for the first time, I receive the following message in the console, and the document is displayed:
LaunchServices: store (null) or url (null) was nil: Error Domain=NSOSStatusErrorDomain Code=-54 "process may not map database" UserInfo={NSDebugDescription=process may not map database, _LSLine=72, _LSFunction=_LSServer_GetServerStoreForConnectionWithCompletionHandler}
Attempt to map database failed: permission was denied. This attempt will not be retried.
Closing the QLPreviewController via the 'Done' option at the top-right, or with a swipe, triggers the following message in the console:
Connection to appex interrupted
AX Lookup problem - errorCode:1100 error:Permission denied portName:'com.apple.iphone.axserver' PID:1022 (
0 AXRuntime 0x00000001d2cd7758 _AXGetPortFromCache + 796
1 AXRuntime 0x00000001d2cdd02c AXUIElementPerformFencedActionWithValue + 700
2 UIKit 0x0000000258cdf488 7F0274D9-D3C9-3193-B606-1C74BE53B86C + 1537160
3 libdispatch.dylib 0x0000000101bb888c _dispatch_call_block_and_release + 32
4 libdispatch.dylib 0x0000000101bba578 _dispatch_client_callout + 20
5 libdispatch.dylib 0x0000000101bc2454 _dispatch_lane_serial_drain + 840
6 libdispatch.dylib 0x0000000101bc325c _dispatch_lane_invoke + 408
7 libdispatch.dylib 0x0000000101bd06fc _dispatch_root_queue_drain_deferred_wlh + 328
8 libdispatch.dylib 0x0000000101bcfd0c _dispatch_workloop_worker_thread + 580
9 libsystem_pthread.dylib 0x0000000225ea4680 _pthread_wqthread + 288
10 libsystem_pthread.dylib 0x0000000225ea2474 start_wqthread + 8
)
Trying to open the document again ultimately results in the blank white screen with no option to close it.
The navigation bar displays only for a fraction of a second, leading users to force close the app and start again.
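For context, a minimal presentation flow of the kind described, as a sketch (the file URL and class name are placeholders, not the poster's code):

import QuickLook
import UIKit

final class DocumentViewController: UIViewController, QLPreviewControllerDataSource {
    // Placeholder: any local document URL.
    let documentURL = URL(fileURLWithPath: "/path/to/document.pdf")

    func presentPreview() {
        let preview = QLPreviewController()
        preview.dataSource = self
        present(preview, animated: true)
    }

    // MARK: QLPreviewControllerDataSource

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int { 1 }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        documentURL as NSURL
    }
}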
Hi all,
From what I’ve seen on forums and other sources, it appears that nothing can be done to set the contact poster programmatically. Setting the imageData property affects only the thumbnail image. Does anyone know if this is explicitly documented somewhere? I need this information for a POC document. I watched the iOS 17 keynote (where it was introduced), the Platform State of Union, and other WWDC videos, but I couldn’t find any mention of it. The Contacts framework documentation only explains what can be retrieved from this property and doesn’t mention any way to set the contact poster.
If anyone has any information on this, please help!
Thanks in advance!
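For anyone landing here, the only image hook I'm aware of in the Contacts framework is imageData, which, as noted above, drives the contact photo/thumbnail rather than the poster. A minimal sketch of setting it (the function name and identifier handling are placeholders):

import Contacts

// Sketch: updating a contact's photo via imageData. As noted above,
// this affects the contact photo/thumbnail, not the contact poster.
func updatePhoto(for identifier: String, with imageData: Data) throws {
    let store = CNContactStore()
    let keys = [CNContactImageDataKey as CNKeyDescriptor]
    let contact = try store.unifiedContact(withIdentifier: identifier, keysToFetch: keys)

    let mutable = contact.mutableCopy() as! CNMutableContact
    mutable.imageData = imageData

    let request = CNSaveRequest()
    request.update(mutable)
    try store.execute(request)
}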
I'd like to set the recordingYear in my Spotlight file importer extension, but the property is missing from CSSearchableItemAttributeSet.
e.g., in the resulting mdls output I'd like to see:
kMDItemRecordingYear = 2008;
This would allow me to search in Finder by the recording year criteria.
There is a recordingDate property, and I tried setting it to a Date that contains only a year, but that didn't work. It just resulted in this:
kMDItemRecordingDate = "2008-01-01 00:00:00 +0000";
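One avenue that might be worth an experiment is a custom attribute key; I have not verified that Spotlight or Finder maps this to kMDItemRecordingYear, and the key name below is a placeholder:

import CoreSpotlight
import UniformTypeIdentifiers

// Unverified sketch: expose the year through a custom attribute key.
// Whether this surfaces as kMDItemRecordingYear in mdls is untested.
let attributes = CSSearchableItemAttributeSet(contentType: .audio)
if let yearKey = CSCustomAttributeKey(keyName: "com_example_RecordingYear",
                                      searchable: true,
                                      searchableByDefault: true,
                                      unique: false,
                                      multiValued: false) {
    attributes.setValue(NSNumber(value: 2008), forCustomKey: yearKey)
}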
I'm unable to get stickers to show in Messages, even with a brand-new iOS app and a sticker pack extension target.
I do see the iMessage app icon, but after tapping it nothing shows, and I see a warning: "Error creating the CFMessagePort needed to communicate with PPT".
This was tested on the simulator and on a real device.
Xcode 16.1 (16B40)
iOS 18.1 & 18.2
Hello! I'm working with universal links in my app and have configured the /.well-known/apple-app-site-association file. Currently, I use the paths array in this file to define URL routing rules. However, I’m struggling to find up-to-date documentation on the pattern syntax supported by the paths field.
"paths": [
"/page/*",
"NOT /page/*/subpage"
]
Could someone clarify:
Is the paths array still officially supported, or is it deprecated in favor of the newer components dictionary (as referenced here https://developer.apple.com/documentation/bundleresources/applinks/details-swift.dictionary/components-swift.dictionary)?
If paths is still valid, where can I find documentation for its pattern-matching capabilities?
I want to ensure my implementation aligns with Apple’s current best practices. Thank you!
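For what it's worth, the same two rules expressed in the components form would look like this (the team/bundle ID is a placeholder); components entries are evaluated in order, so the exclusion rule comes first:

{
  "applinks": {
    "details": [
      {
        "appIDs": ["ABCDE12345.com.example.app"],
        "components": [
          { "/": "/page/*/subpage", "exclude": true },
          { "/": "/page/*" }
        ]
      }
    ]
  }
}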
Hello.
I'm researching Live Caller ID. As I understand it, Apple uses the Privacy Pass architecture to retrieve information from the server. I have a question about the Privacy Pass token: where is it stored on the device?
Hey everyone,
I have an app using the Screen Time API. I've had quite a few reports from users saying that our monitoring features stop working until they open our app. What happens is that activities and schedules set with the device activity monitor seem to disappear. We check for this when the app reopens and schedule them again, which is why the monitoring starts working again.
Of course, our current solution is not optimal, since our app is mainly passive. Has anyone experienced this kind of issue?
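For reference, the re-scheduling pass described above might look roughly like this (the activity name and schedule bounds are placeholders):

import DeviceActivity
import Foundation

// Sketch: on app launch, re-register any schedules that have vanished.
let center = DeviceActivityCenter()
let daily = DeviceActivityName("daily")

if !center.activities.contains(daily) {
    let schedule = DeviceActivitySchedule(
        intervalStart: DateComponents(hour: 0, minute: 0),
        intervalEnd: DateComponents(hour: 23, minute: 59),
        repeats: true
    )
    do {
        try center.startMonitoring(daily, during: schedule)
    } catch {
        print("startMonitoring failed:", error)
    }
}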
Hi there,
My app uses all the Screen Time APIs with individual FamilyControls authorization. I've been using these APIs for over 2 years (since they came out).
In the iOS 18 beta (maybe starting in beta 3?), I've been experiencing random issues. I tracked it down to what looks like the DeviceActivityMonitor extension being more likely to deadlock on iOS 18.
To reproduce: when DeviceActivityMonitorExtension.intervalDidEnd gets called, if you call DeviceActivityCenter.startMonitoring for that same DeviceActivityName from the DeviceActivityMonitorExtension, the startMonitoring call deadlocks (if I pause the debugger, it never advances past DeviceActivityCenter.startMonitoring).
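A minimal sketch of that repro path (the schedule bounds here are placeholders; the linked feedback contains the real sample project):

import DeviceActivity
import Foundation

// Sketch of the reported deadlocking path: restarting the same activity
// from within the monitor extension's intervalDidEnd callback.
final class DeviceActivityMonitorExtension: DeviceActivityMonitor {
    override func intervalDidEnd(for activity: DeviceActivityName) {
        super.intervalDidEnd(for: activity)

        let schedule = DeviceActivitySchedule(
            intervalStart: DateComponents(hour: 0, minute: 0),
            intervalEnd: DateComponents(hour: 23, minute: 59),
            repeats: false
        )
        // Reported to hang here on iOS 18 betas:
        try? DeviceActivityCenter().startMonitoring(activity, during: schedule)
    }
}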
The bug is reported in FB14664238. It also contains a sample project where you can reproduce this.
I also note in the comment section that this is not the only way to encounter this problem. My application code (which is a lot more complicated) seems to deadlock on calling DeviceActivityCenter.activities. As a result, there seems to be an "overall trend" where, due to some changes, DeviceActivityMonitor extension is more likely to deadlock.
The steps are not reproducible on iOS 17.6. This is built using Xcode 17.4.
Thank you! 🙏
Tags: Family Controls, Device Activity, Managed Settings
// A percent-encoded JSON payload (the encoded text includes a Chinese company name)
// used directly as a format string.
let format = "%7B%22sign%22%3Anull%2C%22company%22%3A%22%E5%85%84%E5%BC%9F%E6%B5%B7%E6%B4%8B%E7%A7%91%E6%8A%80%E6%9C%89%E9%99%90%E5%85%AC%E5%8F%B8%22%2C%22businessNo%22%3Anull%2C%22scene%22%3Anull%2C%22interviewCode%22%3A%22767676%22%7D"
// With an empty argument list, sequences like "%7B" and "%22" may be parsed as
// format directives rather than literal percent escapes, mangling the output.
let message = withVaList([]) { args in
    NSString(format: format, arguments: args)
}
print(message)
Tags: Foundation, iPad and iOS apps on visionOS, Swift, Debugging
Is CallKit still unavailable in certain countries, such as China? If it is, is there a way to get a list of those countries?
As I've mentioned before, our app uses the PTT framework to record and send audio messages. In one of the modes the app supports, we use the WebRTC.org library for that purpose. Internally, the WebRTC.org library uses the Voice-Processing I/O audio unit (kAudioUnitSubType_VoiceProcessingIO subtype) to capture audio from the mic. According to https://developer.apple.com/documentation/avfaudio/avaudiosession/mode-swift.struct/voicechat, using the Voice-Processing I/O unit implicitly enables the .voiceChat AVAudioSession mode (i.e., it doesn't seem possible to use the Voice-Processing I/O unit without the .voiceChat mode).
And the problem is the following: when the user starts an outgoing PTT, the PTT framework plays an audio notification, but with the .voiceChat mode enabled that sound plays distorted or not at all.
Questions:
Is this a known issue?
Is there any way to work around it?
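To confirm the implicit mode switch described above, logging the session mode around the unit's start is one quick check (a sketch, not the poster's code):

import AVFoundation

// Sketch: observe whether starting the Voice-Processing I/O unit
// implicitly flips the session into .voiceChat mode.
let session = AVAudioSession.sharedInstance()
print("mode before:", session.mode.rawValue)

// ... start the WebRTC audio unit here (placeholder) ...

print("mode after:", session.mode.rawValue)  // expected: AVAudioSessionModeVoiceChat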
Hi guys,
I'm developing a FinderSync extension that extends the Finder contextual menu with a couple of items that perform some trivial file operations.
I'm using Xcode 16.2 on macOS Sequoia 15.3.2
I can run the containing app in debug, and in System Preferences -> File Providers the flag is enabled for my app's Finder extension.
Anyway, the contextual menu does not show up in Finder, probably because the Finder extension crashes immediately.
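Before the diagnostics, for reference, the skeleton of such an extension (class name, menu title, and watched path are placeholders):

import Cocoa
import FinderSync

// Skeleton of a FinderSync extension adding contextual-menu items.
class FinderSync: FIFinderSync {
    override init() {
        super.init()
        // Watch the folders the menu should appear for (placeholder path).
        FIFinderSyncController.default().directoryURLs = [URL(fileURLWithPath: "/Users")]
    }

    override func menu(for menuKind: FIMenuKind) -> NSMenu {
        let menu = NSMenu(title: "")
        menu.addItem(withTitle: "My Trivial Operation",
                     action: #selector(doOperation(_:)),
                     keyEquivalent: "")
        return menu
    }

    @objc func doOperation(_ sender: AnyObject?) {
        let targets = FIFinderSyncController.default().selectedItemURLs() ?? []
        NSLog("Operating on: %@", targets)
    }
}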
Some output:
pluginkit -m | grep "com\.[^a]"
+ com.mycompany.MyApp.MyAppFinderExtension(1.1.14)
codesign -dvvv --entitlements - /Users/me/Library/Developer/Xcode/DerivedData/MyApp-dmzhnwmosboixodalsrrbwvwvmqm/Build/Products/Debug/MyApp.app/Contents/PlugIns/MyApp\ Finder\ Extension.appex
Executable=/Users/me/Library/Developer/Xcode/DerivedData/MyApp-dmzhnwmosboixodalsrrbwvwvmqm/Build/Products/Debug/MyApp.app/Contents/PlugIns/MyApp Finder Extension.appex/Contents/MacOS/MyApp Finder Extension
Identifier=com.mycompany.MyApp.MyAppFinderExtension
Format=bundle with Mach-O thin (arm64)
CodeDirectory v=20400 size=659 flags=0x2(adhoc) hashes=9+7 location=embedded
Hash type=sha256 size=32
CandidateCDHash sha256=b59538ef9e3b6e8cf462a3e260e3bf26d050deb5
CandidateCDHashFull sha256=b59538ef9e3b6e8cf462a3e260e3bf26d050deb5e21fb27d4fa0a4fe5f3e78b7
Hash choices=sha256
CMSDigest=b59538ef9e3b6e8cf462a3e260e3bf26d050deb5e21fb27d4fa0a4fe5f3e78b7
CMSDigestType=2
CDHash=b59538ef9e3b6e8cf462a3e260e3bf26d050deb5
Signature=adhoc
Info.plist entries=23
TeamIdentifier=not set
Sealed Resources version=2 rules=13 files=9
Internal requirements count=0 size=12
[Dict]
[Key] com.apple.security.app-sandbox
[Value]
[Bool] true
[Key] com.apple.security.application-groups
[Value]
[Array]
[String]
[Key] com.apple.security.assets.movies.read-write
[Value]
[Bool] true
[Key] com.apple.security.assets.music.read-write
[Value]
[Bool] true
[Key] com.apple.security.assets.pictures.read-write
[Value]
[Bool] true
[Key] com.apple.security.files.documents.read-write
[Value]
[Bool] true
[Key] com.apple.security.files.downloads.read-write
[Value]
[Bool] true
[Key] com.apple.security.files.user-selected.read-write
[Value]
[Bool] true
[Key] com.apple.security.get-task-allow
[Value]
[Bool] true
[Key] com.apple.security.personal-information.location
[Value]
[Bool] true
The log is showing something very strange:
log show --predicate 'eventMessage contains "com.mycompany.MyApp" and messageType = error' --last 1h
Filtering the log data using "composedMessage CONTAINS "com.mycompany.MyApp" AND logType == 16"
Skipping info and debug messages, pass --info and/or --debug to include.
Timestamp Thread Type Activity PID TTL
2025-03-25 10:20:48.428127+0100 0x221af Error 0x53dbe 159 0 tccd: [com.apple.TCC:access] Request message contains a target_token to accessing_process (TCCDProcess: identifier=com.mycompany.MyApp, pid=4140, auid=501, euid=501, binary_path=/Users/me/Library/Developer/Xcode/DerivedData/MyApp-dmzhnwmosboixodalsrrbwvwvmqm/Build/Products/Debug/MyApp.app/Contents/MacOS/MyApp) but TCCDProcess: identifier=com.apple.audio.coreaudiod, pid=184, auid=202, euid=202, binary_path=/usr/sbin/coreaudiod is not a TCC manager for service: kTCCServiceScreenCapture.
2025-03-25 10:20:53.166554+0100 0x22139 Error 0x67ff4 163 0 runningboardd: (RunningBoard) [com.apple.runningboard:general] RBSStateCapture remove item called for untracked item 163-158-7088 (target:[app<application.com.mycompany.MyApp.36628067.36635236.92E24CD3-97A8-4340-A46E-4493456283C7(501)>:4140])
2025-03-25 10:20:53.166575+0100 0x22139 Error 0x67ff4 163 0 runningboardd: (RunningBoard) [com.apple.runningboard:general] RBSStateCapture remove item called for untracked item 163-158-7087 (target:[app<application.com.mycompany.MyApp.36628067.36635236.92E24CD3-97A8-4340-A46E-4493456283C7(501)>:4140])
2025-03-25 10:20:53.166582+0100 0x22139 Error 0x67ff4 163 0 runningboardd: (RunningBoard) [com.apple.runningboard:general] RBSStateCapture remove item called for untracked item 163-158-7091 (target:[app<application.com.mycompany.MyApp.36628067.36635236.92E24CD3-97A8-4340-A46E-4493456283C7(501)>:4140])
2025-03-25 10:20:53.166593+0100 0x22139 Error 0x67ff4 163 0 runningboardd: (RunningBoard) [com.apple.runningboard:general] RBSStateCapture remove item called for untracked item 163-132-7084 (target:[xpcservice<com.apple.finder.FinderSync.IsExtensionEnabled([app<application.com.mycompany.MyApp.36628067.36635236.92E24CD3-97A8-4340-A46E-4493456283C7(501)>:4140])(501)>{vt hash: 0}:4144:4144])
--------------------------------------------------------------------------------------------------------------------
Log - Default: 0, Info: 0, Debug: 0, Error: 13, Fault: 0
Activity - Create: 0, Transition: 0, Actions: 0
Especially the first line: MyApp is not accessing coreaudiod, nor trying to capture the screen, so what is going on here?
How is this possible?
Why is the system blocking my FinderSync extension, or preventing it from running?
Thank you in advance
_Alex
Tags: Finder Sync, Entitlements, Signing Certificates, App Sandbox
Hello,
I am unable to figure out how to tell the FamilyActivityPicker whether it should show apps installed on my personal device (to be used with AuthorizationCenter.shared.requestAuthorization(for: .individual)) or apps installed on my child's device (authenticated on their phone via AuthorizationCenter.shared.requestAuthorization(for: .child)).
Is there any parameter or SwiftUI modifier I need to apply?
Otherwise, how does the user or the app know which token belongs to them and which token belongs to their child’s device?
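For context, the picker flow in question, as a minimal sketch (the view name is a placeholder):

import FamilyControls
import SwiftUI

// Sketch: request authorization, then show the picker. Nothing on
// FamilyActivityPicker itself appears to choose whose apps are listed,
// which is exactly the question above.
struct PickerDemo: View {
    @State private var selection = FamilyActivitySelection()

    var body: some View {
        FamilyActivityPicker(selection: $selection)
            .task {
                try? await AuthorizationCenter.shared
                    .requestAuthorization(for: .individual)  // or .child
            }
    }
}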
Radar: FB17020977
Thanks a lot for your help!
When we request auth from the AuthorizationCenter, it seems that we're only really able to let users control the apps on the parent's phone. Is there a way to let parents manage apps on the kid's device directly through our parent app?
For context, we have 2 different apps, one for the parent and one for the child. The child is able to purchase screen time and the parent can redeem them (activate those minutes) from their end.
I'm currently working with the FamilyControls API and testing my app on two different devices. Both are in the same Family Sharing network, with one phone being the owner of the network (I'll call it A) and the other being an adult in the network (I'll call it B).
When device A picks apps using the FamilyActivityPicker, it shares that selection with device B (via encoding, sending over the network, and decoding on device B). However, interacting with the tokens (displaying them, using them in a shield) throws an error saying the token is null.
From the documentation, I thought every token would be the same across all devices in the Family Sharing network. So my questions:
How do I send the FamilyActivitySelection from A to B and have the tokens still be functional?
Does this functionality only work if A is a "parent" and B is a "child" in the family sharing network?
Also, a side note:
If I reverse the process and send the tokens from B to A, interacting with them works exactly as expected. For some reason, it's only going from A to B that it doesn't work.
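For reference, the encode/decode hop described above, sketched; FamilyActivitySelection conforms to Codable, so the wire format is straightforward (the transport itself is omitted):

import FamilyControls
import Foundation

// Sketch: serialize the selection on device A and rebuild it on device B.
// The network transfer between the two calls is omitted.
func encodeSelection(_ selection: FamilyActivitySelection) throws -> Data {
    try JSONEncoder().encode(selection)
}

func decodeSelection(_ data: Data) throws -> FamilyActivitySelection {
    try JSONDecoder().decode(FamilyActivitySelection.self, from: data)
}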
Tags: Application Services, Family Controls, Screen Time