Hello,
We are developing an iAP2 accessory and encountering an issue during the Identification phase.
Issue:
Authentication: ✅ Successful
Identification: ❌ IdentificationInformation rejected (0x1D03)
Product Plan Status: "Submitted" (in MFi Portal)
What we've verified:
ProductPlanUID matches MFi Portal
All required parameters per R43 Table 101-9 are included
Parameters are in ascending order by ID (see the sketch after this list)
Message format appears correct
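For reference, a simplified sketch of how we sort and concatenate the parameters before framing (names like Param are illustrative only; the actual iAP2 parameter encoding follows the MFi Accessory Interface Specification, which I won't reproduce here):

import Foundation

// Hypothetical stand-in for our internal parameter type.
struct Param {
    let id: UInt16      // parameter ID per R43 Table 101-9
    let payload: Data   // already-encoded parameter payload
}

// Sort ascending by ID, then concatenate, before handing the result
// to the iAP2 framing layer.
func serializeParameters(_ params: [Param]) -> Data {
    var out = Data()
    for p in params.sorted(by: { $0.id < $1.id }) {
        out.append(p.payload)
    }
    return out
}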
Observation:
iPhone accepts the message format but still rejects IdentificationInformation, suggesting the issue may be related to Product Plan configuration or status rather than parameter format.
Questions:
Can a Product Plan with status "Submitted" complete iAP2 identification, or does it need to be "Approved"?
Are there any Product Plan configuration requirements that might not be visible in MFi Portal?
Should we configure "Control Message Lists" in Product Plan? (We don't see this option in Portal)
We can provide additional technical details through secure channels if needed.
Thank you for your assistance.
Hi,
We are facing an issue where commissioning our Matter device to Google Home through an iOS device fails 100% of the time.
Here is our test summary regarding the issue:
TestCase1 [OK]: Commissioning our Matter 1.4.0 device to Google Nest Hub 2 via an Android device (see log DoorWindow_2.0.1_Google_Success.txt)
TestCase2 [NG]: Commissioning our Matter 1.4.0 device to Google Nest Hub 2 via iPhone 13 or iPhone 16 (see log DoorWindow_2.0.1_Google_by_iOS_NG.txt)
TestCase3 [OK]: Commissioning our Matter 1.3.0 device to Google Nest Hub 2 via iPhone 13
In TestCase2, we noticed that the device was first commissioned to iOS (Apple keychain), and then iOS opened a commissioning window again to commission it into Google's ecosystem; the device failed at this second step. So we also tried the following:
Commissioning the device to Apple Home works as expected; sharing the device to the Google Home app on iOS afterwards also fails.
Commissioning the device to Apple Home works as expected; sharing the device to the Google Home app on Android works as expected, and the device appears in Google Home on iOS as well.
Could you help us check what the issue in TestCase2 is?
Our test environment:
Nest Hub 2 version
Google Home app version
Hi,
I’m developing a Matter commissioning flow and would like to clarify Apple Home’s support for concatenated (multi-device) QR codes.
In my implementation, I generate a single QR code that contains multiple Matter onboarding payloads (concatenated payloads), intended to commission multiple devices in one scan, similar to a multi-pack / multi-accessory flow.
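Concretely, I build and split the combined payload roughly as sketched below. This is a sketch only: I'm assuming the '*' delimiter that the Matter specification defines for concatenated onboarding payloads.

import Foundation

// Build one QR string from several Matter onboarding payloads.
// Inputs are the base38 bodies *without* their "MT:" prefixes.
func concatenatedQRPayload(_ payloads: [String]) -> String {
    "MT:" + payloads.joined(separator: "*")
}

// Split a (possibly concatenated) QR string back into single-device
// payloads, each re-prefixed with "MT:" for individual commissioning.
func splitQRPayload(_ qr: String) -> [String] {
    guard qr.hasPrefix("MT:") else { return [] }
    return qr.dropFirst(3).split(separator: "*").map { "MT:" + String($0) }
}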
What I’ve tested:
Standard single-device Matter QR codes work as expected in the Apple Home app
A concatenated QR code (multiple Matter payloads combined into one QR) does not get recognized / commissioned by Apple Home
My questions:
Does Apple Home officially support commissioning via concatenated or multi-device Matter QR codes?
If yes, is there a specific payload format or delimiter that Apple Home expects?
If not, is this a known limitation or something planned for future iOS/Home releases?
Hello everyone,
I am developing an iOS application that relies on accelerometer data for precise motion and reaction-time measurements.
Based on practical testing, it appears that third-party iOS applications receive accelerometer data at a maximum rate of approximately 100 Hz, regardless of hardware capabilities or requested update intervals.
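For reference, this is roughly how I measured the rate (a minimal sketch; the 500 Hz request is deliberately far above the observed cap):

import CoreMotion
import Foundation

let manager = CMMotionManager()
var samples = 0
let start = Date()

// Request 500 Hz; on my test devices the delivered rate still
// tops out near 100 Hz.
manager.accelerometerUpdateInterval = 1.0 / 500.0
manager.startAccelerometerUpdates(to: .main) { data, _ in
    guard data != nil else { return }
    samples += 1
    let elapsed = Date().timeIntervalSince(start)
    if elapsed >= 10 {
        print("Effective rate: \(Double(samples) / elapsed) Hz")
        manager.stopAccelerometerUpdates()
    }
}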
I would like to ask for clarification on the following points:
Is there an officially supported way for third-party iOS apps to access accelerometer data at sampling rates higher than ~100 Hz?
If the hardware supports higher sampling rates, is this limitation intentionally enforced at the iOS level for third-party applications?
Are there any public APIs, entitlements, or documented approaches that allow access to higher-frequency sensor data, or is this restricted to system/internal components only?
Thank you in advance for any clarification.
I followed the instructions on the page https://mfi.apple.com/en/help/login-help/How-to-Register-Your-Existing-Apple-ID.html to apply for the MFi Program. According to step 7 of the guide, "You have now created and registered your Apple Account. You will be automatically directed to the MFi Portal to begin the enrollment process," I should have been taken to the enrollment process after logging in.
However, instead of accessing the enrollment page, a pop-up message appears stating: "The Apple Account you signed in with does not have permission to view this page. If you believe your company is currently enrolled in the MFi Program, please contact your company’s Account Administrator to request access to the MFi Portal. If your company is not currently enrolled in the MFi Program, please click here to learn about the program and start the enrollment process."
This has created an endless loop: I cannot proceed to the enrollment process as instructed, and the pop-up only redirects me to information that leads back to the same login and permission issue. Could you please provide guidance on how to resolve this and successfully access the MFi Program enrollment process?
Summary
On Mac Studio systems (no built-in camera), macOS does not initialize camera services after a normal reboot if no physical camera is present. As a result, Continuity Camera does not appear anywhere in the system.
Observed behavior
System Information → Camera reports “No video capture devices were found.”
Continuity Camera (iPhone) is completely absent from camera lists.
Plugging in any USB UVC webcam immediately initializes camera services and causes both the USB camera and the iPhone (Continuity Camera) to appear.
The USB camera can then be unplugged and Continuity Camera continues working until the next reboot.
Reproduction steps
Use a Mac Studio (no built-in camera) on recent macOS.
Ensure no USB webcam or external camera is connected.
Reboot the Mac normally.
After login, open System Information → Camera.
Expected
Camera services should initialize even when no physical camera is present, allowing Continuity Camera to be available as the primary camera.
Actual
No camera devices are present unless a physical USB camera is connected at least once after boot.
This reproduces 100% of the time on Mac Studio and appears to be a camera service bootstrap issue where Continuity Camera cannot be the first camera device.
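For anyone who wants to check this programmatically rather than via System Information, a minimal sketch (device-type availability varies by macOS version, so treat the exact deviceTypes list as an assumption):

import AVFoundation

// Right after a clean reboot this prints nothing on the Mac Studio;
// once a UVC camera has been plugged in, the iPhone appears here too.
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.external, .continuityCamera],
    mediaType: .video,
    position: .unspecified
)
if discovery.devices.isEmpty {
    print("No video capture devices found")
}
for device in discovery.devices {
    print(device.localizedName, device.deviceType.rawValue)
}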
Issue has been filed via Feedback Assistant.
Hi everyone,
We are currently exploring ways to implement a frictionless Wi-Fi setup for our hardware devices without requiring a dedicated third-party application. We are interested in leveraging Apple's WAC (Wireless Accessory Configuration) to sync Wi-Fi credentials directly from iOS devices. However, we have struggled to find comprehensive technical documentation or specifications regarding the WAC service. Could anyone point us to the official source for these materials?
Additionally, we have a couple of technical questions:
1. We are testing WAC provisioning and found that the Home app can discover our device and successfully get it online. However, it always ends with a "Failed to add accessory" message.
Does WAC support imply that a device should be addable via the Home app? If not, why is the Home app able to discover and start the setup for a non-HomeKit WAC device?
2. Our device is already Apple AirPlay certified. Does implementing WAC require additional standalone certification, or is it covered under the existing MFi/AirPlay certification umbrella?
Any insights or guidance would be greatly appreciated. Thank you!
Hello,
I am a Bluetooth Engineer at Google investigating an interoperability bug between an Android device and AirPods 4. When the Android device requests an L2CAP connection (with PSM = AVDTP) to the AirPods during SDP service discovery, the AirPods' L2CAP layer incorrectly responds with a "refused - no resources available" status, followed by a Pending status and a Success status. This violates the specification, which states that after the refused status the request has been fully rejected and should not receive any follow-up responses. I suspect the "no resources available" response is a bug. This prevents A2DP from working with the AirPods.
This bug does not exist with AirPods 2 firmware.
Here is a packet capture:
1602 1969-12-31 16:07:04.805261 0.062473 localhost () Apple_6b:db:09 (AirPods) L2CAP 17 Sent Connection Request (AVDTP, SCID: 0x22c6)
1603 1969-12-31 16:07:04.810953 0.005692 controller host HCI_EVT 8 Rcvd Number of Completed Packets
1604 1969-12-31 16:07:04.811078 0.000125 Apple_6b:db:09 (AirPods) localhost () SDP 27 Rcvd Service Search Attribute Request : Device Information: [Bluetooth Profile Descriptor List 0x0009]
1605 1969-12-31 16:07:04.821249 0.010171 localhost () Apple_6b:db:09 (AirPods) SDP 19 Sent Service Search Attribute Response
1606 1969-12-31 16:07:04.876396 0.055147 controller host HCI_EVT 8 Rcvd Number of Completed Packets
1607 1969-12-31 16:07:04.876464 0.000068 Apple_6b:db:09 (AirPods) localhost () L2CAP 21 Rcvd Connection Response - Refused - no resources available (SCID: 0x22c6)
1608 1969-12-31 16:07:04.942539 0.066075 Apple_6b:db:09 (AirPods) localhost () SDP 41 Rcvd Service Search Attribute Request : Unknown: [Bluetooth Profile Descriptor List 0x0009]
1609 1969-12-31 16:07:04.951052 0.008513 localhost () Apple_6b:db:09 (AirPods) SDP 19 Sent Service Search Attribute Response
1610 1969-12-31 16:07:05.010605 0.059553 controller host HCI_EVT 8 Rcvd Number of Completed Packets
1611 1969-12-31 16:07:05.080593 0.069988 Apple_6b:db:09 (AirPods) localhost () SDP 27 Rcvd Service Search Attribute Request : GATT: [Bluetooth Profile Descriptor List 0x0009]
1612 1969-12-31 16:07:05.087636 0.007043 localhost () Apple_6b:db:09 (AirPods) SDP 19 Sent Service Search Attribute Response
1613 1969-12-31 16:07:05.209417 0.121781 controller host HCI_EVT 8 Rcvd Number of Completed Packets
1614 1969-12-31 16:07:05.279491 0.070074 Apple_6b:db:09 (AirPods) localhost () L2CAP 21 Rcvd Connection Response - Pending (SCID: 0x22c6)
1615 1969-12-31 16:07:05.280731 0.001240 Apple_6b:db:09 (AirPods) localhost () L2CAP 21 Rcvd Connection Response - Success (SCID: 0x22c6, DCID: 0x0406)
Please file this bug with the AirPods Bluetooth team.
Hello Apple Developer Technical Support Team,
I’m working on an iOS banking/security SDK and we’re trying to match an Android feature that reads payment cards via NFC (EMV). On Android, this is implemented using an NFC scanning screen (e.g., “NfcScanActivity”) that can read EMV data from contactless credit/debit cards.
Could you please clarify the current iOS capabilities and App Store policy around this?
On iOS, is it currently possible for a third-party App Store app to read contactless credit/debit cards using Core NFC (i.e., accessing EMV application data/AIDs from payment cards)?
If this is possible, what are the supported APIs/frameworks and any entitlement requirements (if applicable)?
If this is not possible for App Store apps, could you recommend the closest acceptable alternatives for achieving a similar user outcome? For example:
Using Apple Pay / PassKit flows for payment-related experiences
Card scanning alternatives (camera-based OCR) for capturing card details (if allowed)
Using an external certified card reader accessory (MFi) and required approach/entitlements
Any other Apple-recommended approach for “card verification / identification” without reading EMV NFC data
Our goal is not to bypass security restrictions, but to provide a compliant solution on iOS comparable to Android’s NFC-based card reading, or to adopt an Apple-approved alternative if direct EMV reading is not supported.
If helpful, I can share a brief technical summary of the Android behavior and the exact data we need to obtain (e.g., whether it’s card presence verification vs. reading specific EMV tags).
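For concreteness, the Core NFC path I would expect to use, if it's permitted, is sketched below. This is an assumption on my part, not working App Store code: it relies on the payment PPSE AID ("2PAY.SYS.DDF01") being allowed in the ISO 7816 AID list, which is exactly what I'm asking about.

import CoreNFC

final class CardReader: NSObject, NFCTagReaderSessionDelegate {
    var session: NFCTagReaderSession?

    func begin() {
        session = NFCTagReaderSession(pollingOption: .iso14443, delegate: self)
        session?.begin()
    }

    func tagReaderSessionDidBecomeActive(_ session: NFCTagReaderSession) {}
    func tagReaderSession(_ session: NFCTagReaderSession, didInvalidateWithError error: Error) {}

    func tagReaderSession(_ session: NFCTagReaderSession, didDetect tags: [NFCTag]) {
        guard let first = tags.first, case let .iso7816(tag) = first else { return }
        session.connect(to: first) { _ in
            // SELECT PPSE: the standard EMV contactless directory entry.
            // Whether this AID may appear in
            // com.apple.developer.nfc.readersession.iso7816.select-identifiers
            // for an App Store app is the open question above.
            let selectPPSE = NFCISO7816APDU(
                instructionClass: 0x00, instructionCode: 0xA4,
                p1Parameter: 0x04, p2Parameter: 0x00,
                data: Data("2PAY.SYS.DDF01".utf8), expectedResponseLength: -1)
            tag.sendCommand(apdu: selectPPSE) { data, sw1, sw2, error in
                print(data as NSData, sw1, sw2, error ?? "ok")
            }
        }
    }
}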
Thank you for your guidance.
Best regards,
Anis
I want to track the activity of an iPhone user using the Core Motion framework. Please guide me through it.
Hello everybody,
I have a never-ending issue with App Store review and need quick help!
I am submitting a new app (oral training) for iPhones only.
I disabled other devices (such as iPads) via Xcode.
In the App Store information form, it is mandatory to provide iPad screenshots, so I provided screenshots showing the iPhone experience.
The App Store team asked me to remove them because I don't support iPads. But if I remove those screenshots, the form cannot be submitted.
I don't understand how to proceed.
Thanks for the help
Regards
Jean
Will native UVC support come to the iPhone as well?
Using external cameras with the iPad is greatly beneficial, but on the iPhone it could make the device a production powerhouse!
So, have there been discussions around bringing UVC support to the iPhone as well? And if so, what were the conclusions?
Dear Sir,
I have some questions about IC firmware development for Find My.
Is there any rule requiring that an item's IC include a dual-bank feature?
I am using the Nordic nRF Connect SDK. Does Apple have its own SDK? If yes, can I download and use it?
For Find My, does Apple define a service UUID in the Bluetooth IC?
Thank you.
Best regards,
Sam Ng
Hey everyone, how’s it going?
I’d like to know if, by enrolling in Apple’s MFi program, I’ll gain access to develop my own tags and my own app to track them using Apple’s Find My network. I also read that there’s an estimated cost of $4 per device. Does that apply to each device produced, or only at the time of registering the device, with no fee for additional units?
How do I remove Matter accessory connection artefacts? These appear after connecting and then removing a Matter test accessory. Please see the attached screenshot:
When I go to Software Update, it says it can't check for updates.
I updated my iPhone to iOS 26, and when I went to check whether it was up to date, I came across this.
Hi everyone,
while testing HKWorkoutSession with HKLiveWorkoutBuilder on iOS 26 Beta (cycling workout), I noticed the following behavior:
– Starting a cycling HKWorkoutSession automatically connects to my Bluetooth heart rate monitor and records HR into HealthKit ✅
– However, my Bluetooth cycling power meter and cadence sensor (standard BLE Cycling Power & CSC services) are not connected automatically, and no data is recorded into HealthKit ❌
On Apple Watch, when starting a cycling workout, these sensors do connect automatically and their data is written to HealthKit — which is exactly what I would expect on iOS as well.
Question:
Is this by design, or is support for power and cadence sensors planned for iOS in the same way as on watchOS?
Or do we, as developers, need to implement the BLE Cycling Power and CSC profiles ourselves (via CoreBluetooth, sketched below) if we want these metrics?
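To illustrate the fallback I mean in the last question: a minimal CoreBluetooth sketch using the standard Bluetooth SIG assigned numbers (Cycling Power 0x1818, CSC 0x1816; measurement parsing per the GATT specs is omitted):

import CoreBluetooth

let cyclingPowerService = CBUUID(string: "1818")
let cscService          = CBUUID(string: "1816")
let powerMeasurement    = CBUUID(string: "2A63")
let cscMeasurement      = CBUUID(string: "2A5B")

final class SensorScanner: NSObject, CBCentralManagerDelegate, CBPeripheralDelegate {
    private var central: CBCentralManager!
    private var peripherals: [CBPeripheral] = []   // keep strong references

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        central.scanForPeripherals(withServices: [cyclingPowerService, cscService])
    }

    func centralManager(_ central: CBCentralManager, didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any], rssi RSSI: NSNumber) {
        peripherals.append(peripheral)
        peripheral.delegate = self
        central.connect(peripheral)
    }

    func centralManager(_ central: CBCentralManager, didConnect peripheral: CBPeripheral) {
        peripheral.discoverServices([cyclingPowerService, cscService])
    }

    func peripheral(_ peripheral: CBPeripheral, didDiscoverServices error: Error?) {
        for service in peripheral.services ?? [] {
            peripheral.discoverCharacteristics([powerMeasurement, cscMeasurement], for: service)
        }
    }

    func peripheral(_ peripheral: CBPeripheral,
                    didDiscoverCharacteristicsFor service: CBService, error: Error?) {
        for characteristic in service.characteristics ?? [] {
            // Subscribe, then decode per the Cycling Power / CSC Measurement
            // formats in peripheral(_:didUpdateValueFor:error:).
            peripheral.setNotifyValue(true, for: characteristic)
        }
    }
}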
Environment:
– iOS 26 Beta
– HKWorkoutSession & HKLiveWorkoutBuilder (cycling)
– Bluetooth HRM connects automatically
– BLE power & cadence sensors do not
This feature would make it much easier to develop cycling apps with full HealthKit integration, and also create a more consistent user experience compared to watchOS.
Thanks for any insights!
I am developing an app that communicates with an external BLE device over GATT. The device has a secure-read characteristic exposing some of its data and requires pairing/bonding in order to communicate with it.
I was able to pair and connect with the device using AccessorySetupKit and the .bluetoothPairingLE option:
let descriptor = ASDiscoveryDescriptor()
descriptor.bluetoothServiceUUID = CBUUID(string: serviceUUID)
descriptor.supportedOptions = [.bluetoothPairingLE]
let pickerItem = ASPickerDisplayItem(name: name, productImage: image, descriptor: descriptor)
In this case, when setting up the accessory, I was prompted to compare passkeys, and after confirming I can read the characteristic, etc.
Then I tried adding .confirmAuthorization to the picker item, and the problems started:
let descriptor = ASDiscoveryDescriptor()
descriptor.bluetoothServiceUUID = CBUUID(string: serviceUUID)
descriptor.supportedOptions = [.bluetoothPairingLE]
let pickerItem = ASPickerDisplayItem(name: name, productImage: image, descriptor: descriptor)
pickerItem.setupOptions = [.confirmAuthorization]
When setting up, I can see a passkey to be confirmed, but once confirmed, the setup UI gets stuck in a loading state. Under the hood, in the logs, I can see that my app has connected to the peripheral and was able to read the characteristic.
I am unsure why the UI is stuck in a loading state in this case. What is the difference when using the .confirmAuthorization option, and what should the proper flow of events be to set up an accessory and then access a protected characteristic?
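For completeness, the surrounding session code looks roughly like this (a sketch; serviceUUID, accessoryName, and accessoryImage are placeholders):

import AccessorySetupKit
import CoreBluetooth
import UIKit

let serviceUUID = "0000FFF0-0000-1000-8000-00805F9B34FB"  // placeholder
let accessoryName = "My Accessory"                        // placeholder
let accessoryImage = UIImage()                            // placeholder

let session = ASAccessorySession()

func setUpAccessory() {
    // Activate the session before showing the picker.
    session.activate(on: DispatchQueue.main) { event in
        print("ASAccessoryEvent:", event.eventType)
    }

    let descriptor = ASDiscoveryDescriptor()
    descriptor.bluetoothServiceUUID = CBUUID(string: serviceUUID)
    descriptor.supportedOptions = [.bluetoothPairingLE]

    let pickerItem = ASPickerDisplayItem(name: accessoryName,
                                         productImage: accessoryImage,
                                         descriptor: descriptor)
    pickerItem.setupOptions = [.confirmAuthorization]  // the option in question

    session.showPicker(for: [pickerItem]) { error in
        if let error { print("Picker failed:", error) }
    }
}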
Dear Apple Developer / MFi Program Support,
I am exploring technical possibilities for screen sharing and remote interaction between iOS devices and external hardware (e.g., embedded systems, in-vehicle systems) for a prototype we are currently developing.
I have reviewed the public iOS developer documentation, but I would appreciate your guidance and clarification on the following advanced use cases, particularly in the context of MFi or enterprise-level integrations:
Full-Screen Sharing of iOS Device
Is it possible to mirror or stream the entire iOS screen, even when the app is running in the background or not in the foreground?
Does ReplayKit or any other framework under the MFi or enterprise entitlements allow full-device screen capture outside the app context? (See the in-app baseline sketched below.)
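For reference, the only capture path I've found in the public API is ReplayKit's in-app capture, which is strictly app-scoped (a minimal sketch); the question above is whether anything beyond this exists under MFi or enterprise entitlements:

import ReplayKit

// App-scoped capture only: records this app's own UI while it is in
// the foreground. It does not capture the device outside the app.
func startInAppCapture() {
    let recorder = RPScreenRecorder.shared()
    recorder.startCapture(handler: { sampleBuffer, bufferType, error in
        if bufferType == .video {
            // Forward sampleBuffer to an encoder / network stream here.
        }
    }, completionHandler: { error in
        if let error { print("Capture failed:", error) }
    })
}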
Remote Touch Injection and Control
Is there any officially supported mechanism, under MFi or otherwise, that allows external systems to remotely control an iOS device’s touch interface (e.g., simulate gestures, taps, swipes)?
Are any of the following permitted under special entitlements:
Access to IOHIDEventSystem or similar private APIs for input injection?
Communication over USB or network to relay control commands that simulate direct user interaction?
Hardware-Level Integration and Entitlements
Does the MFi Program allow:
Use of private frameworks or entitlements to build low-level integrations for iOS device control or mirroring?
Communication over USB/Lightning/USB-C to enable bi-directional interaction (streaming out, commands in)?
What are the specific APIs or entitlements available under MFi that enable these use cases?
Can you provide references to documentation, SDKs, or prerequisites for companies seeking such capabilities?
Eligibility and Certification Process
What are the criteria to be approved for the MFi program with access to such advanced capabilities?
Can PoC or early-stage research prototypes be eligible, or is MFi access restricted to commercial production intent?
How long does it typically take to gain access to these entitlements (assuming NDA and certification requirements are met)?
Alternative Pathways
If MFi access is not feasible in the short term, is there any Apple-supported alternative path (e.g., test device provisioning, enterprise signing, custom profiles) that permits more advanced capabilities for prototyping purposes?
We are not looking to publish this as a general App Store app at this stage, but rather to demonstrate feasibility as part of an innovation prototype that may lead to further OEM-level engagement in the future.
Thank you for your support and guidance.
Best regards,