But that's not at all what is happening; it isn't analyzing the content of your images at all.
Step 1) Apple ships a list of 1's and 0's on your phone that matches the "fingerprints" of known CSAM (Child Sexual Abuse Material). This list is bundled inside iOS itself.
Step 2) When you upload a photo to iCloud, your iPhone (on device) checks whether the 1's and 0's of the picture you're about to upload match the set of known CSAM 1's and 0's stored on your phone. If they do, then when your phone sends the picture to the cloud it also sends a "flag."
Step 3) If you have more than 30 flags in your iCloud account, then some sort of notification can occur. Apple could adjust this number to 1, I suppose, for argument's sake, but it would still be an upload of a picture of known CSAM. Apple doesn't even know which 30 images you have, just that you have more than 30. (All three steps are sketched in code below.)
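Here's a rough sketch of those three steps in Swift. Everything in it is a hypothetical stand-in: Apple's real system uses a perceptual hash called NeuralHash and encrypted "safety vouchers" with threshold secret sharing, not plain strings and a boolean flag, but the shape of the logic is the same.

```swift
import Foundation

// Step 1: the on-device database of known-CSAM fingerprints,
// shipped as an opaque blob inside iOS (empty here as a stand-in).
let knownFingerprints: Set<String> = []

// Stand-in for the fingerprinting step. The real NeuralHash is a
// perceptual hash: visually identical images produce the same value
// even after resizing or recompression, which a plain byte-for-byte
// hash would not.
func fingerprint(of photo: Data) -> String {
    photo.base64EncodedString() // hypothetical placeholder, not the real algorithm
}

// Step 2: at upload time the device checks for a match and attaches a
// flag. In the real design the flag is an encrypted voucher Apple
// cannot read on its own, so even per-photo matches aren't visible.
struct Upload {
    let photo: Data
    let flagged: Bool
}

func prepareUpload(_ photo: Data) -> Upload {
    Upload(photo: photo,
           flagged: knownFingerprints.contains(fingerprint(of: photo)))
}

// Step 3: nothing about the account is learnable until the number of
// flags crosses the threshold (30 in Apple's announcement).
let threshold = 30

func accountExceedsThreshold(_ uploads: [Upload]) -> Bool {
    uploads.filter(\.flagged).count > threshold
}
```

Note that in this sketch the check runs only on photos headed to iCloud; photos that never leave the device are never compared against anything.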
In short, Apple still sees NOTHING stored on your phone.