Photos: Missing Faces in Apple's AI

Well, we all know that lots of applications now detect faces, which is cool. Google Photos is the leader in this market, and other companies have since taken it on.

In iOS 10 and Sierra, Apple had each device scan its own "database" of your photos, and I found they all detected different faces and different people, which in one word is useless.

My MacBook would detect some faces that my iPhone would not, while the iPhone would find other faces, and then my iPad would do the same again - the usual big Apple mess.

So in iOS 11 and High Sierra all of that is "synced" via iCloud, that well-known reliable service full of bugs. Let's look at what I am talking about - this is what Google Photos sees in the "Faces" view....

This is not all the people it sees, and the names are removed for obvious reasons, but Google Photos sees 18 people in my photos with another 6 still processing - all good here.



Right, so Apple Photos only sees me, and not even all of me. Where are the other 17 people Google Photos can see - more like 23 if you include the ones still processing?

iCloud and data still seem to be a fail when Apple applies its "it just works" line to iCloud, as iCloud does everything but just work - kinda like the Titanic.

So, I thought I could re-scan my photos. Nope, you cannot, as Apple has removed this option from the GUI. This is the version I am on, which is the latest:



I therefore worry that if Photos in Sierra works and High Sierra does not, this close to the GM and launch, Apple are once again expecting non-beta testers to become beta testers after it goes out to their production environment.
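
For what it's worth, on macOS the face scanning is handled by a background daemon called photoanalysisd, which reportedly only does its work while Photos itself is closed and the Mac is idle and on power. Since Apple gives you no re-scan button, about all you can do is check whether the daemon is actually running - here is a quick diagnostic sketch (my own, nothing official from Apple):

```shell
#!/bin/sh
# Check whether macOS's background face-analysis daemon is active.
# photoanalysisd does Photos' face/scene analysis on Sierra and later;
# it tends to run only when Photos is quit and the Mac is idle.
if pgrep -x photoanalysisd >/dev/null 2>&1; then
  echo "photoanalysisd is running"
else
  echo "photoanalysisd is not running"
fi
```

If it is not running, quitting Photos, plugging the Mac in, and leaving it alone overnight is the usual (unofficial) way to let the analysis catch up.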