submitted 2 weeks ago by [email protected] to c/[email protected]

I've watched the keynote and read some stuff online, and I found this video of a guy talking about the new update (I linked it because, if you didn't see the keynote, it's probably enough to catch you up)

Is it just me, or... is no one addressing that Apple is pulling a Microsoft move by basically scanning everything on every machine and feeding it into their LLM?

29 comments
[-] [email protected] 3 points 2 weeks ago

Because its target audience thinks it's a feature?

[-] [email protected] 2 points 2 weeks ago

Here is an alternative Piped link(s):

https://piped.video/watch?v=B_SqrNMPV5c&pp=ygUMTWFjIG9zIHNwaWVz

Piped is a privacy-respecting open-source alternative frontend to YouTube.

I'm open-source; check me out at GitHub.

[-] [email protected] 1 points 2 weeks ago* (last edited 2 weeks ago)

So I’m curious . . . what reference am I missing that helps me understand what menu settings cause exactly which pieces of personal data to be shared with which Apple services? I want to RTFM, and while I appreciate people wanting to be helpful, comment replies are not themselves documentation.

(I switched from Android to iOS in 2020 and haven’t really figured out details beyond turning iCloud sync off for specific apps. I’d like to add more devices and learn to trust that sync method, but I don’t understand where crypto is used and how the keys are handled.)

[-] [email protected] 2 points 2 weeks ago

Everything in iCloud is encrypted except for email and something else that’s obviously not encrypted that I can’t fucking remember.

iCloud encryption can be defeated with a server-side key that Apple uses if you need to recover your account (say your account gets hijacked or you forget your password). Because Apple holds this capability, it can be compelled by subpoena, like any other company, to provide the contents of your iCloud.

If you don’t like that, you can turn on Advanced Data Protection, which deletes their server-side key, generates new keys, and re-encrypts everything after you write down a special alphanumeric recovery key, without which your iCloud contents are inaccessible.
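A minimal toy sketch of the difference (this is illustrative only, not Apple's actual protocol; the recovery-code format, key sizes, and XOR "wrap" are stand-ins): under standard protection the provider escrows a copy of the data key, while under Advanced Data Protection the data key is wrapped only under a key derived from the user's recovery code.

```python
import hashlib
import secrets

def derive_wrapping_key(recovery_code: bytes, salt: bytes) -> bytes:
    """Stretch the user's written-down recovery code into a 32-byte key."""
    return hashlib.pbkdf2_hmac("sha256", recovery_code, salt, 200_000)

def xor_wrap(key: bytes, wrapping_key: bytes) -> bytes:
    """Toy key wrap via XOR; real systems use AES Key Wrap (RFC 3394)."""
    return bytes(a ^ b for a, b in zip(key, wrapping_key))

data_key = secrets.token_bytes(32)  # the key that encrypts the actual data

# Standard protection: the provider escrows a copy of the data key,
# enabling account recovery -- and responses to subpoenas.
server_escrow = data_key

# Advanced Data Protection: the escrow copy is destroyed; the data key is
# wrapped under a key derived from the user's recovery code instead.
salt = secrets.token_bytes(16)
recovery_code = b"ABCD-1234-EFGH-5678"  # illustrative; user writes this down
wrapped_key = xor_wrap(data_key, derive_wrapping_key(recovery_code, salt))
server_escrow = None  # the provider can no longer recover the data

# Only the holder of the recovery code can unwrap the data key.
unwrapped = xor_wrap(wrapped_key, derive_wrapping_key(recovery_code, salt))
assert unwrapped == data_key
```

The point the sketch makes: "advanced data protection" doesn't change the encryption of the data itself, it changes who is able to recover the key.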

The security checkup in settings will let you figure out who has access to what.

[-] [email protected] 1 points 2 weeks ago
[-] [email protected] -2 points 2 weeks ago
[-] [email protected] 1 points 2 weeks ago

...because that would just affect Apple users, not me. I don't choose to use Windows, I'm forced to, so I hate when they take away my choice of keeping it out of my stuff

[-] [email protected] -2 points 2 weeks ago

Apple has its LLMs running locally. That's a huge win, and I hope it sets a standard. I really hope someone makes a FOSS Android app that can do something similar.

[-] [email protected] 2 points 2 weeks ago

Microsoft's models run locally too. It doesn't make a difference, because they are shipping everything you do home.

[-] [email protected] 0 points 2 weeks ago

Doesn't Copilot use ChatGPT, which isn't local? That's why we need a good FOSS AI ecosystem.

[-] [email protected] 2 points 2 weeks ago

The requirement for Recall is a neural coprocessor with substantial performance, specifically to be able to run the model locally.

[-] [email protected] -5 points 2 weeks ago
  • Usually people get their Microsoft license for free (through work, where the company buys an enterprise license).
  • But with Apple, people usually buy in out of passion.
[-] [email protected] -5 points 2 weeks ago

It's a very simple answer: Apple has guaranteed that your data will stay on your device and stay secure. This is generally trusted because Apple has a track record of keeping user data secure on the device, or encrypted in the cloud in ways even Apple cannot access. The point is, when Apple says they are going to do this in a way that respects privacy, and they outline the technical details of how it will work, people trust that because there's a track record.

Microsoft has no such trust. They have a recent track record of being intrusive and using dark patterns to push users into giving Microsoft their data. For example, Edge has shown new-feature pop-ups that require data sharing with Microsoft where the two options are 'got it' and 'settings': accepting takes one click, while rejecting takes four, going into the settings menu and changing a few things. Microsoft is also heavily pushing Copilot, which is mostly cloud-based.

Furthermore, Microsoft recently showed a system (Recall) that would screenshot your computer at very regular intervals and store the captures in an insecure manner. Granted, everything stayed on the device, but the way the captures were going to be stored meant they could be stolen with two lines of code. And let's not forget that Windows 11 cannot be set up without a Microsoft account, so to even use your computer you have to share your email address with Microsoft.

In these and many other ways, they just do not act like a company that respects privacy at all. They act like the typical big-tech 'give us everything or we will make your life difficult' company that nobody trusts.
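To illustrate the "two lines of code" point: a toy sketch of why an unencrypted on-device store offers no real protection. The file name and schema below are illustrative assumptions, not Recall's actual layout; the setup section just stands in for what the capture system would have written.

```python
import sqlite3

# Stand-in for an unencrypted on-device capture store. The file name and
# table schema are illustrative assumptions, not Recall's real layout.
con = sqlite3.connect("capture_demo.db")
con.execute("CREATE TABLE IF NOT EXISTS captured_text (ts TEXT, content TEXT)")
con.execute("INSERT INTO captured_text VALUES ('2024-06-14', 'user typed a password')")
con.commit()
con.close()

# The "two lines of code": any process running as the same user can dump
# the whole history, because nothing is encrypted at rest.
rows = sqlite3.connect("capture_demo.db").execute(
    "SELECT content FROM captured_text").fetchall()
print(rows)  # every captured snippet, in the clear
```

Keeping data "on device" only helps if the store is encrypted or access-controlled; otherwise any malware (or nosy co-user) with the same account privileges reads it all.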

this post was submitted on 14 Jun 2024
182 points (88.2% liked)

Asklemmy

42460 readers

A loosely moderated place to ask open-ended questions


If your post meets the following criteria, it's welcome here!

  1. Open-ended question
  2. Not offensive: at this point, we do not have the bandwidth to moderate overtly political discussions. Assume best intent and be excellent to each other.
  3. Not regarding Lemmy usage or support: for context, see the list of support communities and tools for finding communities below
  4. Not ad nauseam inducing: please make sure it is a question that would be new to most members
  5. An actual topic of discussion

Looking for support?

Looking for a community?

Icon by @Double_[email protected]

founded 5 years ago