Apple doing it right, IMO

Trust with customers is important (obviously). It’s been interesting watching the backlash from technically minded early adopters over recent moves. Word gets out. It’s that old adage/myth from the 70s that a customer with a bad experience will tell at least 9-10 others (now multiplied by however many social media followers or Substack readers they have).

Adobe is rowing back like Pinsent/Redgrave at the Olympics with regard to using customer data to train its AI.

Adobe overhauls terms of service to say it won’t train AI on customers’ work
The company is trying to win back trust after last week’s backlash.

But the damage is done. Creatives were already very annoyed about it.

Microsoft? Still quiet about Recall. I’m not sure sticking their proverbial head in the sand and hoping it goes away is going to work.

A PR disaster: Microsoft has lost trust with its users, and Windows Recall is the straw that broke the camel’s back
The world is up-in-arms over Windows Recall, but why? It stems from Microsoft’s seeming lack of care for Windows and its users.

But they need to get this response right.

And then Apple releases ‘Apple Intelligence’, and it is a LESSON in secure private cloud computing. The article linked below is fascinating. The proof is in the pudding, of course, but it seems very, very well done:

Blog - Private Cloud Compute: A new frontier for AI privacy in the cloud - Apple Security Research
Secure and private AI processing in the cloud poses a formidable new challenge. To support advanced features of Apple Intelligence with larger foundation models, we created Private Cloud Compute (PCC), a groundbreaking cloud intelligence system designed specifically for private AI processing. Built with custom Apple silicon and a hardened operating system, Private Cloud Compute extends the industry-leading security and privacy of Apple devices into the cloud, making sure that personal user data sent to PCC isn’t accessible to anyone other than the user — not even to Apple. We believe Private Cloud Compute is the most advanced security architecture ever deployed for cloud AI compute at scale.
“But this last requirement, verifiable transparency, goes one step further and does away with the hypothetical: security researchers must be able to verify the security and privacy guarantees of Private Cloud Compute, and they must be able to verify that the software that’s running in the PCC production environment is the same as the software they inspected when verifying the guarantees.”

They’re still taking heat, especially from folk beginning to understand the copyright and data issues (and after that awful ad). But this is a good direction to take. Apple had to do something with AI. The markets would have murdered them otherwise - and business is business.

Subscribe to Gary P Shewan
