this post was submitted on 15 Aug 2023
19 points (95.2% liked)

Linux


I've been playing with the largest models I can get running and have been using Librewolf or Firefox, but these use several gigabytes of system memory. What options exist that have less overhead? I'm mostly looking to maximize the memory available for model training as I'm learning. The obvious solution is Python in a terminal, but I need a hiking trail, not free solo rock climbing.

all 7 comments
[–] [email protected] 9 points 10 months ago (1 children)
[–] [email protected] 6 points 10 months ago* (last edited 10 months ago) (1 children)

Absolutely right. I just tried it on the browsers installed on my system, loading this page:

Firefox: 560MiB
Epiphany (GNOME Web): 226MiB
elinks: 16MiB
lynx: 14MiB

Looks like lynx is the winner

(Sidenote: This isn't really a fair fight for Firefox since it's my daily driver, with extensions installed and a bunch of stuff cached. I'm guessing even a fresh install wouldn't get below 300MiB, though)
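If you want to reproduce a comparison like the one above, here is a minimal sketch that sums resident set size (RSS) per process name by reading `/proc/<pid>/status` directly. It's Linux-only and ignores shared pages (tools like `smem` report PSS for a fairer comparison), and the browser names in the usage block are just examples:

```python
# Rough per-process memory check on Linux: sum VmRSS across all
# processes whose name matches. Reads /proc/<pid>/status, so this is
# Linux-specific and counts shared pages fully (RSS, not PSS).
import os

def total_rss_mib(name: str) -> float:
    """Sum VmRSS (in MiB) over all processes whose Name field matches."""
    total_kib = 0
    for pid in filter(str.isdigit, os.listdir("/proc")):
        try:
            with open(f"/proc/{pid}/status") as f:
                fields = dict(
                    line.split(":", 1) for line in f if ":" in line
                )
        except OSError:
            continue  # process exited or is inaccessible; skip it
        if fields.get("Name", "").strip() != name:
            continue
        rss = fields.get("VmRSS")
        if rss:
            total_kib += int(rss.split()[0])  # value is reported in kB
    return total_kib / 1024

if __name__ == "__main__":
    # Example process names; adjust to whatever browsers you have running.
    for browser in ("firefox", "lynx"):
        print(f"{browser}: {total_rss_mib(browser):.0f} MiB")
```

Note that a multi-process browser like Firefox spawns several differently named helper processes, so a single-name match like this undercounts it compared to what a task manager shows.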

[–] [email protected] 7 points 10 months ago

Something WebKit-based, probably. GNOME Web is likely the most accessible of these.

[–] [email protected] 6 points 10 months ago* (last edited 10 months ago) (1 children)

Not what you're asking for, but how about putting the web browser and the page rendering on a different machine? This way your main machine can focus on calculating.

Edit: If the pages are super simple, there are "web browsers" that work on the command line and can render simple pages in a very crude way.
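At their simplest, that crude command-line rendering amounts to stripping tags (and skipping `<script>`/`<style>` content) and printing what's left. A minimal sketch with only the standard library, nothing like a real text-mode browser's layout or link handling:

```python
# A very crude "text-mode render": drop tags, skip script/style/head
# content, and print the remaining text. No layout, links, or CSS.
from html.parser import HTMLParser

class CrudeRenderer(HTMLParser):
    SKIP = {"script", "style", "head"}  # elements whose text we discard

    def __init__(self):
        super().__init__()
        self.depth_skipped = 0  # >0 while inside a skipped element
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self.depth_skipped += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self.depth_skipped:
            self.depth_skipped -= 1

    def handle_data(self, data):
        if self.depth_skipped == 0 and data.strip():
            self.chunks.append(data.strip())

def render(html: str) -> str:
    parser = CrudeRenderer()
    parser.feed(html)
    return "\n".join(parser.chunks)

print(render("<html><head><title>t</title></head>"
             "<body><h1>Hello</h1><script>var x;</script>"
             "<p>world</p></body></html>"))
# prints "Hello" then "world"
```

Pair this with `urllib.request` to fetch a page and you have a toy lynx in a few dozen lines, with a memory footprint far below any GUI browser.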

[–] [email protected] 0 points 10 months ago* (last edited 10 months ago)

This is kind of what I was thinking. I have a $5/mo VPS running Selenoid. All it does is take incoming requests to simulate a real user clicking on some stuff.

Basically, I run a website in Ruby on Rails that has to talk to some APIs. Unfortunately, the industry that app works in is very behind with tech, so I make do with simulating a user visiting some portals in lieu of actual API calls. It's great because the resource-constrained containers don't have to power up an entire web browser in background jobs, though running these tasks as long-running background jobs presents other issues.
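For anyone curious what driving a remote Selenoid box looks like at the wire level: Selenoid speaks the W3C WebDriver protocol (it normally exposes the hub at port 4444 under `/wd/hub`), and its own settings travel in the vendor-prefixed `selenoid:options` capability. A sketch of building the "new session" request body; the host below is hypothetical:

```python
# Sketch of the WebDriver "new session" payload a client would send to a
# remote Selenoid instance. Host/port are placeholders; /wd/hub and the
# "selenoid:options" capability are Selenoid's documented conventions.
import json

SELENOID_URL = "http://my-vps.example.com:4444/wd/hub"  # hypothetical host

def new_session_payload(browser: str = "chrome", vnc: bool = False) -> dict:
    """Build a W3C-style new-session body; Selenoid reads the
    vendor-prefixed 'selenoid:options' block for its own settings."""
    return {
        "capabilities": {
            "alwaysMatch": {
                "browserName": browser,
                "selenoid:options": {"enableVNC": vnc},
            }
        }
    }

# A client POSTs this JSON to f"{SELENOID_URL}/session" and reuses the
# returned session id for subsequent commands (navigate, click, ...).
body = json.dumps(new_session_payload("firefox"))
print(body)
```

In practice you'd let a client library (Ruby's `selenium-webdriver` gem, in this commenter's case) build and send this for you rather than crafting the JSON by hand.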

[–] [email protected] 1 points 10 months ago

There's a reason these browsers use that much memory. Something is living there, and it's not just overhead. You can't realistically reduce it by a meaningful amount just by switching browsers while retaining functionality.