Kerfuffle

joined 1 year ago
[–] [email protected] 2 points 10 months ago (6 children)

The solution may be stupid but it is an actual problem. Do I really want to make an extra dish to clean just to have a place to put the spoon? That's annoying.

[–] [email protected] 2 points 10 months ago (1 children)

I don’t know if nostupidquestions would’ve accepted the question (they tend to reject questions aimed at viewer experience)

Some might take the fact that it's too stupid for nostupidquestions as a sign. :)

Humorous hyperbole or not, it's still an unanswerable question. How can you interpret it? "Which do you personally prefer?" - who really cares whether some random person prefers chocolate over vanilla and if you want to know what's common it's easy to find surveys. "If you prefer one, why?" - uhh, "it tastes better to me". What else can one really say?

At least something like "If you had to prove chocolate is better than vanilla or vanilla is better than chocolate, is there a food or recipe you'd use?" I didn't take the time to make it sound like a decent title, but at least that general approach might encourage interesting discussion: sharing foods people may not have been aware of already, recipes, etc. There's nowhere really to go with this post, though.

[–] [email protected] 3 points 10 months ago (3 children)

I don’t know why this was downvoted so much.

It's a silly question that's impossible to answer. For subjective preferences like flavors, none are "objectively better". They also could have posted this in "no stupid questions" which at least invites questions that are apparently dumb, but they didn't.

I didn't downvote it personally, but why it got downvoted certainly isn't a mystery.

[–] [email protected] 1 points 10 months ago

What's wrong with sea lions?

[–] [email protected] 3 points 10 months ago

The graph actually looks like it's saying the opposite. For most of the categories where there's actually a decent span of time, it climbs rapidly and then slows down/levels off considerably. It makes sense too: when new technology is discovered, a breakthrough is made, or a field opens up, there's going to be quite a bit of low-hanging fruit. So you get the initial step that wasn't possible before, and people scramble to participate. After a while, though, incremental improvements get harder and harder to find and implement.

I'm not expecting progress with AI to stop, and I'm not even saying it won't be "rapid", but I do think we're going to see progress on the LLM stuff slow down compared to the last year or so unless something crazy like the Singularity happens.

[–] [email protected] 2 points 11 months ago

It is only a matter of time before we’re running 40B+ parameters at home (casually).

I guess that's kind of my problem. :) With 64GB RAM you can run 40, 65, 70B parameter quantized models pretty casually. It's not super fast, but I don't really have a specific "use case" so something like 600ms/token is acceptable. That being the case, how do I get excited about a 7B or 13B? It would have to be doing something really special that even bigger models can't.
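As a rough sanity check (my own back-of-the-envelope numbers, nothing official — the overhead factor is a guess covering the KV cache and runtime buffers), the memory footprint of a quantized model is roughly parameters times bits-per-weight divided by eight:

```python
# Back-of-the-envelope RAM estimate for running quantized LLMs.
# The ~20% overhead factor is a rough guess for KV cache and runtime buffers.
def approx_ram_gb(params_billion: float, bits_per_weight: float,
                  overhead: float = 1.2) -> float:
    bytes_for_weights = params_billion * 1e9 * bits_per_weight / 8
    return bytes_for_weights * overhead / 1e9

for size in (7, 13, 40, 65, 70):
    print(f"{size}B @ 4-bit: ~{approx_ram_gb(size, 4):.0f} GB")
```

By that estimate a 4-bit 70B model lands around 40GB, which is why it fits comfortably in 64GB of RAM.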

I assume they'll be working on a Vicuna-70B 1.5 based on LLaMA 2 as well, so I'll definitely try that one out when it's released, assuming it performs well.

[–] [email protected] 1 points 11 months ago (2 children)

Is anyone using these small models for anything? I feel like an LLM snob, but I don't feel motivated to even look at anything smaller than the 40-70B range when it's possible to use those models.

[–] [email protected] 5 points 11 months ago

That seems like they left debugging code enabled/accessible.

No, this is actually a completely different type of problem. LLMs also aren't code, and they aren't manually configured/set up/written by humans. In fact, we kind of don't really know what's going on internally when performing inference with an LLM.

The actual software side of it is more like a video player that "plays" the LLM.
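To make that analogy concrete (a toy sketch with made-up names, not how any real runtime is written): the "player" is generic code that just applies stored numbers to the input, and all of the behavior lives in those numbers, the same way a video file's content isn't part of the player.

```python
# Toy illustration of the "player" analogy: run_model() is generic code,
# and everything the "model" does is determined by the weights, which are
# just data (like the frames in a video file).
def matvec(matrix, vector):
    return [sum(w * x for w, x in zip(row, vector)) for row in matrix]

def run_model(weights, vector):
    # The "player": apply each layer's weights in sequence. Nothing here
    # encodes what the model does; only the numbers do.
    for layer in weights:
        vector = matvec(layer, vector)
    return vector

# Two different "files" played by the same "player":
identity = [[[1, 0], [0, 1]]]
swap     = [[[0, 1], [1, 0]]]
print(run_model(identity, [3, 5]))  # [3, 5]
print(run_model(swap, [3, 5]))      # [5, 3]
```

Same code both times; completely different behavior, purely because the data changed. That's why "debugging code left enabled" isn't really a category that applies here.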

[–] [email protected] 8 points 11 months ago

By "attack" they mean "jailbreak". It's also nothing like a buffer overflow.

The article is interesting though and the approach to generating these jailbreak prompts is creative. It looks a bit similar to the unspeakable tokens thing: https://www.vice.com/en/article/epzyva/ai-chatgpt-tokens-words-break-reddit
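The general shape of that kind of automated prompt search, as I understand it, is: treat the suffix as a sequence of tokens and greedily pick whichever candidate token most improves some objective. The real attacks score candidates using the target model's own loss/gradients; this toy stand-in just scores against a fixed target string to show the search loop itself:

```python
# Toy greedy token search in the spirit of automated jailbreak-prompt
# generation: repeatedly try candidate tokens and keep whichever one most
# improves a score. Real attacks score with the target model's loss; the
# "objective" here is a stand-in that rewards matching a target string.
VOCAB = list("abcdefghijklmnopqrstuvwxyz ")

def score(suffix: str, target: str = "open sesame") -> int:
    # Stand-in objective: count of positions matching the target so far.
    return sum(1 for a, b in zip(suffix, target) if a == b)

def greedy_suffix(length: int = 11) -> str:
    suffix = ""
    for _ in range(length):
        # Try every token in the vocabulary; keep the best-scoring one.
        best = max(VOCAB, key=lambda tok: score(suffix + tok))
        suffix += best
    return suffix

print(greedy_suffix())  # converges on "open sesame" for this toy objective
```

The interesting part in the paper is that the same loop, driven by model gradients instead of a toy score, spits out suffixes that look like gibberish to us but reliably steer the model.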

[–] [email protected] 6 points 11 months ago

A more restrictive license wouldn’t help in that case.

Well, it depends. Elsewhere in the thread, people mentioned licenses that have ethics clauses:

  1. https://firstdonoharm.dev/
  2. https://anticapitalist.software/

How enforceable these are (or whether I would actually have the resources to do anything about a violation) is another problem, but it still might give some entity pause. Generally though, using a restrictive license like the GPL is pretty likely to make Puppy Punching Worldwide Inc look for other alternatives as well. Odds are, their puppy-punching software isn't going to be compatible with a license like the GPL.

[–] [email protected] 1 points 11 months ago

whoever budges and makes way for the other male has just become the submissive male.

For someone who actually thinks like that, being insecure about how someone else might think they're the "submissive male" makes them the "submissive male". "Real men" aren't going to care what some random person who will never affect them thinks, especially if it's silly stuff like this.

Alpha posturing aside, a lot of people make all kinds of unnecessary sacrifices out of insecurity too. It's good to start developing the habit of catching yourself and doing whatever you wanted to in the first place, regardless of what people think (within limits, obviously).

[–] [email protected] 9 points 11 months ago (2 children)

I usually use MIT, partially because my current interests (AI/LLM stuff) involve interfacing with some other projects that are MIT and partially because it's just a simple "do whatever" license and I don't really care to enforce terms. Of course, if I thought some government or company was going to use stuff I develop to launch the nukes or control a robot fist to punch cute little puppies right in the snout then I'd start using a more restrictive license but the odds of that are... pretty much nonexistent for everything I've ever created.
