this post was submitted on 18 Jul 2024
492 points (98.4% liked)

Memes

[–] [email protected] 47 points 2 months ago* (last edited 2 months ago) (15 children)

Technically possible with a small enough model to work from. It's going to be pretty shit, but "working".

Now, if we were to go further down in scale, I'm curious how/if a 700MB CD version would work.

Or how many 1.44MB floppies you would need for the actual program and smallest viable model.
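For anyone who wants to actually run the numbers: a quick back-of-the-envelope sketch. The payload sizes are pure assumptions (a few MB for a llama.cpp-style runner binary, ~500 MB for a heavily quantized "smallest viable" model); the floppy capacity is the real one, since a "1.44 MB" floppy actually holds 1440 KiB.

```python
import math

FLOPPY_BYTES = 1_474_560  # a "1.44 MB" floppy really holds 1440 KiB (1440 * 1024)

def floppies_needed(payload_bytes: int) -> int:
    """Whole floppies required to hold a payload of the given size."""
    return math.ceil(payload_bytes / FLOPPY_BYTES)

# Assumed sizes, not measured: ~5 MB for the inference program,
# ~500 MB for a heavily quantized tiny model.
program_bytes = 5 * 1024**2
model_bytes = 500 * 1024**2

print(floppies_needed(program_bytes + model_bytes))  # → 360
```

So under those assumptions you'd be swapping through a few hundred disks before the first token.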

[–] [email protected] 16 points 2 months ago (3 children)

squints

That says, "PHILLIPS DVD+R"

So we're looking at a 4.7GB model, or just a hair under the tiniest, most incredibly optimized implementation of <INSERT_MODEL_NAME_HERE>

[–] [email protected] 13 points 2 months ago (1 children)

Llama 3 8B, Phi 3 Mini, Mistral, Moondream 2, Neural Chat, Starling, Code Llama, Llama 2 Uncensored, and LLaVA would fit.
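The fit check itself is trivial to sketch. The candidate sizes below are illustrative assumptions for quantized model files, not measured downloads; check the real file sizes before burning anything.

```python
DVD_BYTES = 4_700_000_000  # single-layer DVD capacity, decimal GB as marketed

def fits_on_dvd(model_bytes: int) -> bool:
    """True if a model file of this size fits on a single-layer DVD."""
    return model_bytes <= DVD_BYTES

# Illustrative, assumed file sizes (quantized GGUF-style models).
candidates = {
    "small 4-bit model (~2.3 GB assumed)": 2_300_000_000,
    "7B-class 4-bit model (~4.4 GB assumed)": 4_400_000_000,
    "8B-class 8-bit model (~8.5 GB assumed)": 8_500_000_000,
}
for name, size in candidates.items():
    print(name, "fits" if fits_on_dvd(size) else "does not fit")
```

Note the marketing catch: 4.7 GB here is decimal gigabytes, so the disc holds about 4.38 GiB as your file manager would report it.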

[–] [email protected] 1 points 2 months ago (1 children)

Just interested in the topic: did you 🔨 it offline, privately?

[–] [email protected] 1 points 2 months ago

I'm not an expert on them or anything, but feel free to ask.
