macOS 13.1: mediaanalysisd has nothing to do with the CSAM scan

After installing macOS 13.1, several Mac users reported that the system suddenly tried to connect to an Apple server via a "mediaanalysisd" process. Firewall apps like Little Snitch reported the corresponding network traffic. Because of last year's discussion surrounding the CSAM (Child Sexual Abuse Material) scan, mediaanalysisd has been implicated, despite Apple's announcement that it will not roll out the feature. And so a big misunderstanding was born.

There are currently some reports about the mediaanalysisd process under macOS 13.1 on the Apple Mac. Contrary to a widely shared blog post, this has nothing to do with the CSAM scan.

Rumors about a secret launch of the CSAM scan

The critics of the CSAM scan should not be accused of bad faith; their motives are data protection and the protection of privacy. But some of them, now frequently quoted in the current discussion about mediaanalysisd, assume that images and their content end up at Apple as a plain copy. And that could also apply to your own nude pictures, if you have any.

However, there is no such transmission. This rumor was started by connecting two unrelated topics. Images are still not sent to Apple, and nudity is still not explicitly searched for. The only thing that happens – as has been the case for years – is image analysis for object recognition. And that is clearly communicated.

What is the mediaanalysisd process anyway?

As the name suggests, mediaanalysisd is used for media analysis, i.e. for classifying local image content. The process has existed under this name for years, on macOS as well as on iOS and iPadOS. It is used to classify people, objects, animals, plants, art and the like, so that you can find the respective pictures in the Photos app or start a search in online encyclopedias from there.

Plants, trees, dog breeds and the like can be identified in this way. Furthermore, the "Live Text" text recognition and the recognition of QR codes are based on this process. Discussions about it, especially about the energy consumption of buggy versions, go back to 2020 in Apple's user forum and as far back as seven years in Apple's developer forum. It is therefore not a newly introduced system process with new tasks.

How is image content securely analyzed and transmitted?

No detailed image descriptions with all their small and large details are passed on to the Apple servers. And a copy of the images as a file is certainly not sent. Instead, hash codes are used: so-called neural hashes serve the purpose of image recognition and the smart functions built on top of it.
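To illustrate the difference between such hashes and ordinary file hashes, here is a toy sketch in Python. It is not Apple's actual NeuralHash (which uses a neural network); the simple "average hash" below merely demonstrates the general idea of a perceptual hash: a tiny change to one pixel leaves the perceptual hash unchanged, while a cryptographic hash like SHA-256 changes completely.

```python
import hashlib

def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set if the pixel is
    brighter than the image mean. Real systems (e.g. Apple's NeuralHash)
    derive the hash from a neural network instead."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

def sha256_hash(pixels):
    """Cryptographic hash over the raw pixel bytes."""
    return hashlib.sha256(bytes(p for row in pixels for p in row)).hexdigest()

# A small 4x4 grayscale "image" (values 0-255).
image = [[10, 12, 200, 210],
         [11, 13, 205, 215],
         [ 9, 14, 198, 220],
         [12, 10, 202, 212]]

# The same image with one pixel changed slightly (e.g. sensor noise).
noisy = [row[:] for row in image]
noisy[0][0] = 11

print(average_hash(image) == average_hash(noisy))  # True: perceptual hash tolerates the change
print(sha256_hash(image) == sha256_hash(noisy))    # False: cryptographic hash flips entirely
```

The point of the sketch: a perceptual hash describes image content without being reversible into the image itself, which is why sending such a hash is not the same as sending the picture.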

How it all works, how the image analysis functions in detail and further information can be found in the article "Is Apple checking images we view in the Finder?", published by The Eclectic Light Company on January 18, 2023. There is also an explanatory diagram in case the text is too long.

Has mediaanalysisd extended access to images under macOS 13.1?

So now we know that mediaanalysisd is a process that has been in use for years, one that does not transmit image content one-to-one but only sends hashed data values. These are used for object recognition as well as for looking up image content, identifying QR codes, and extracting and translating text.

All of these functions have been clearly communicated and used by many for years. This has nothing to do with the CSAM scan. It was only associated with it because that was a big topic that received media attention recently. However, the fact that the media analysis process now also contacts Apple servers even when the Photos app and the like are not in use could simply be a bug.

This view is supported, for example, by this Twitter thread, which provides a technical analysis of the process. Screenshots of this analysis are shared so you can see for yourself (something the people who link mediaanalysisd to the CSAM scan do not do). This shows that the algorithms are used for object recognition.

Extracted lists of possible image content (animals) are also shown. Furthermore, it is pointed out that the technology is used to recognize QR codes. The sixth tweet shows the decrypted network request, which is essentially empty. This suggests that the expansion of mediaanalysisd network requests on macOS 13.1 is a bug. "The network call is clearly a bug," the thread even states.

Conclusion on confusing image recognition with the CSAM scan

There's a blog post doing the rounds right now, cited by some who don't do research and prefer to share accusations rather than facts. Since I do not want to offer the author a platform, I will not link the blog post here. In short, the author does not use Apple Photos, iCloud or other Apple services, but spotted the mediaanalysisd process on macOS via the Little Snitch firewall and drew his own conclusions from it: namely, that Apple is now scanning images for child sexual abuse and could therefore be hoarding nude images and creating a backdoor for the authorities. But these claims are unfounded, as the sources linked above show.

Did you like the article and did the instructions on the blog help you? Then I would be happy if you would support the blog via a Steady membership or on Patreon.

1 comment on "macOS 13.1: mediaanalysisd has nothing to do with the CSAM scan"

  1. It's exactly the same functionality, just for a different purpose. NN hashes plus network. This is inherently a privacy disaster, like Siri. These are not “classic” hash functions such as SHA.


In the Sir Apfelot Blog you will find advice, instructions and reviews on Apple products such as the iPhone, iPad, Apple Watch, AirPods, iMac, Mac Pro, Mac Mini and Mac Studio.