[-] [email protected] 11 points 1 week ago* (last edited 1 week ago)

I use Obsidian, which is quite powerful with its vast plugin library. You can do a lot of automation, and you can check out some of Nicole van der Hoeven's videos; among other things she uses it to keep track of TTRPG campaigns, both as a player and as a game master. For example this one.

I don't use their sync service, but keep all my files locally on my Nextcloud server. I sync them to my phone with Syncthing, which unfortunately means I cannot encrypt them with Cryptomator as I had planned, but if you only use it on your computer, that is something you could do. If you are paranoid about Obsidian still phoning home with your data, you can block its network access with a firewall. I think you can also install plugins manually.

I would have preferred it if it were FOSS. I have considered checking out Logseq as an alternative, but the bullet-based workflow doesn't appeal to me, so I haven't tried it yet. I switched over from Standard Notes, and honestly it was a pain to transfer because the text export from Standard Notes was all over the place, as I had used a lot of different note types. I tried to parse some of the smart notes they have, but I couldn't quickly figure out how they were structured to automate it, so I ended up manually going through and copying over what I wanted to keep. I like the approach of keeping plain-text Markdown files. It is easier to export to another application in the future, although some of the content will be useless as it is explicitly written for the plugins (e.g. Dataview).
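
If anyone else is facing the same migration: the ordinary notes in the decrypted backup JSON can probably be pulled out with a few lines like the sketch below (assuming the usual layout of items with content_type "Note" and a content block holding title and text; treat the field names and the file name as assumptions). It was the smarter note types I couldn't make sense of.

import json
from pathlib import Path

# Load the decrypted Standard Notes backup (file name is just an example)
backup = json.loads(Path("SN-backup.txt").read_text(encoding="utf-8"))
out = Path("notes")
out.mkdir(exist_ok=True)

for item in backup.get("items", []):
    if item.get("content_type") != "Note":
        continue  # skip tags, components and the smarter note types
    content = item.get("content", {})
    title = content.get("title") or item.get("uuid", "untitled")
    # Build a filesystem-safe file name from the note title
    safe = "".join(c if c.isalnum() or c in " -_" else "_" for c in title).strip()
    (out / f"{safe or 'untitled'}.md").write_text(content.get("text", ""), encoding="utf-8")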

[-] [email protected] 9 points 1 week ago

Nice, didn't know about Celeste. Will check it out :)

[-] [email protected] 37 points 1 week ago* (last edited 1 week ago)

You don't have to use all the services. I have the Unlimited plan and mostly use Mail with custom domains (+ the included SimpleLogin account) and VPN, plus Drive for backup (the lack of a Linux client makes it a no-go for daily use, but I have my own Nextcloud server that serves that purpose fine). Pass I have not tried (I use another manager), and Calendar I also don't use.

I still feel I am getting my money's worth.

[-] [email protected] 1 points 1 week ago

And that neither F nor A = 0

[-] [email protected] 3 points 2 weeks ago

Real question: Is it not possible to install KDE, even though they do not provide an ISO with it?

[-] [email protected] 3 points 2 weeks ago

I've mostly been very satisfied with my InfinityBook 14 Gen7, which I got about 1.5 years ago. There have been some hardware issues (something wrong with the audio subboard causes the sound from the speakers to cut out once in a while, but they sent a new one that I haven't installed yet...). The mic is also not very good (some background noise), and the speakers, when they work (which is most of the time), are quite weak. I decided to spec it out as much as possible, and it does get hot under high loads, like gaming. The case is sleek, but perhaps a little flimsy?

But mostly it works perfectly fine, and it is such a great upgrade over my old MacBook that I finally get to do stuff on my computer, and I run into very few limitations (newer games and other GPU-intensive tasks requiring more than 4 GB of VRAM are the only things). Not to mention that I've had a very good experience with their customer service when I n00b out and can't troubleshoot my way back.

[-] [email protected] 1 points 3 weeks ago

Is Kdenlive no good? I've always heard good things, but I don't use that kind of software.

[-] [email protected] 23 points 3 weeks ago

An alternative is to keep your eggs somewhat separated so that you don't end up locked in if their services deteriorate over the years, giving you an easier escape in that scenario.

[-] [email protected] 3 points 3 weeks ago

It is all free to use, but you will likely have some expenses with the self-hosting. If you do it yourself at home, you need hardware to run it on and power to keep it running, and you would be well off having some additional off-site backup solution as well, which adds to the cost. If you host on a VPS (like I do), you have the running cost of renting that server space.

[-] [email protected] 5 points 3 weeks ago

I use my self-hosted Nextcloud instance for this. Then sync to mobile using DAVx5. Calendar and contacts.

[-] [email protected] 1 points 4 weeks ago

What I really mean is: can I keep a backup of the game that I can play later without having to use their launcher?

[-] [email protected] 3 points 1 month ago

I recently deleted my Meta-account, and I hope they will be a thing of the past in the not too distant future. Zuck can get fucked.

5
submitted 1 month ago by [email protected] to c/[email protected]

I am currently in the process of finally getting rid of my Meta account. As part of that, I have requested a data export. The media stuff was made available pretty quickly, but the data logs are still being processed. Does anyone know what data they actually contain, and whether there's any point in waiting for them?

The reason I ask is that I also recently got a notification saying that they will soon train their AI model on my data, using the "legitimate interest" bullshit as the legal basis. I want to have my account deleted by the time this is phased in (towards the end of June).

So now I am facing the dilemma of waiting for the data logs to complete (and I don't know how long that will take) or just deleting my account in the hope that it will be purged before the AI stuff goes into effect. I am unable to find out exactly what these data logs consist of and whether there is any point in holding onto them for whatever reason.

Now, whether I can trust that they actually delete the data is another matter, but at least I would've done what I can, and they would be breaking the law if they retain the data after my deletion request (under GDPR).

26
submitted 2 months ago by [email protected] to c/[email protected]

I have a specific issue I want to solve right now, but the topic is phrased more generally because I would love an answer to that as well. This might make it an XY problem, though, so here's the actual problem I want to solve:

I am using LibreWolf as my main browser, and it has WebGL disabled by default to avoid fingerprinting. I would like to keep it this way, but I am currently also making some internal tools for myself that require WebGL (map renders with Plotly in Dash).

Is there a way to tell LibreWolf to enable WebGL only for specific sites, so that I don't have to manually toggle this when I want to look at my maps? My initial thought was that this could be solved with some kind of site-specific about:config override.
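
For context, the maps are roughly along these lines (a minimal sketch with made-up coordinates; the real tools are more involved). The mapbox-based traces are drawn with WebGL, which is why they break when it is disabled:

import plotly.express as px
from dash import Dash, dcc, html

# Mapbox traces in Plotly are rendered through WebGL,
# so this figure needs WebGL enabled in the browser
fig = px.scatter_mapbox(
    lat=[59.91, 60.39],   # made-up example points
    lon=[10.75, 5.32],
    zoom=4,
    mapbox_style="open-street-map",
)

app = Dash(__name__)
app.layout = html.Div([dcc.Graph(figure=fig)])

if __name__ == "__main__":
    app.run(debug=True)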

0
submitted 3 months ago by [email protected] to c/[email protected]

I have previously written a lot of code that is hosted in a public repo on GitHub, but it has never had a license. It was written as part of my work for a non-commercial academic entity, and I would like to add a license before the link to the repo is included in something that will be made public, potentially attracting one or two visitors.

This leaves me with a couple of questions:

  1. Can I just add a license after the fact, and will it be valid for all prior work?
  2. Do I have to make sure the license is included in all branches of the repo, or does this not matter? There are, for instance, a couple of branches that are used to freeze the state of the code at a certain time for reproducibility's sake (I know this could be solved in a better way, but that's how it is).
  3. I have myself reused some of the code in my current work for a commercial entity (internal analysis work, only distributed within the organization). Should this influence the type of license I choose? I am considering a GPL license, but should I go with (what I believe to be) a more permissive license like MIT because of this?
1
submitted 10 months ago* (last edited 10 months ago) by [email protected] to c/[email protected]

For some time now I have been trying to clean up my digital footprint by requesting deletion of unused accounts and their associated data, and by being critical about which accounts I actually benefit from keeping. This turned out to be far more time-consuming than I imagined beforehand.

I've been using a password manager for about a decade, so I have a fairly good overview of a lot of the accounts I've opened over the years. However, while privacy has always been important to me, for many years I was more concerned with increasing governmental surveillance than with corporate surveillance. So over the years I've signed up uncritically for a large number of services. Most of these do not have much data about me, but my username has generally been reused, along with e-mail and sometimes phone number and other more sensitive data. This of course doesn't take into account all those minor services I've signed up for with e-mail + reused password. I have no control over those...

Now, GDPR thankfully makes the job of cleaning up the accounts I do have control over a lot easier, because I doubt many of these services would even let me delete my account if not for it. However, it does not regulate enough how easy this process should be, and there are so many different ways companies implement it. These range from extremely convenient and easy ways of exporting all data and deleting the account, as implemented by Strava (kudos to these companies!), to the worst offender of them all: British Airways... Until recently you would have to send an actual letter to their data protection officer with a copy of your passport (yeah right...). Sometime this year they changed this, so now you just have to upload a picture of a letter to their documents portal, but since that is borked, I can't even access it to complete the deletion request. Apple also rejected my deletion request for an unknown reason, and I had to spend 45 minutes on the phone with them to understand that a cancelled but still active App Store subscription (a 1-year subscription that had not yet expired) was blocking the deletion. Most are in between these two extremes, and either require that I e-mail their data protection officer with my request and actively follow up to make sure I get a reply, or have processes that take up to a month to complete.

Of course, cleaning up 10-15 years of uncritical online presence would take a long time anyway, but companies making it hard on purpose to delete your account and data is infuriating, and a testament to a status quo that should burn in hell.

On the plus side: I no longer have accounts with Microsoft and Twitter, and my accounts with Apple and Amazon should soon be closed. My goal is to have completely phased out Meta and Google by the end of this year, although the communication lock-in of Meta and the fact that my primary e-mail was Gmail for 15 years (I switched to Proton two years ago) make these transitions a bit more difficult.

If nothing else, this process has made me very conscious about platform lock-in and the "joys" of ecosystems...

3
submitted 10 months ago by [email protected] to c/[email protected]

This is a question mostly for the sake of trying to learn more about how self-hosting works, and it is not vital that I resolve this. But if anyone wants to help me understand this, I would greatly appreciate it.

I have a media server running at home with certain Docker containers (Jellyfin, Navidrome and Audiobookshelf currently). I have not exposed these services to the internet, so they are currently only accessible on my home network, which is all I need for the time being. The server itself is connected to an external VPN provider as there may or may not be some torrenting involved at some point. Let's say the name of the server is mediaserver.

From my laptop connected to the same network, I can access all these services through http://mediaserver.local:<port> or http://<server-IP>:<port>, even while the laptop is connected via the same VPN provider. On my cell phone (running CalyxOS), I am unable to do so; I need to disable the VPN in order to access the services.

What is the difference between my laptop connected via VPN and my phone doing the same thing, both connected to my home network? I didn't actually think the VPN would come into play before making requests outside my home network, but that's probably just me being ignorant.

40
submitted 10 months ago by [email protected] to c/[email protected]

I've been self-hosting Nextcloud on Linode for some time. At some point in the not too distant future, I plan on hosting it locally on a server in my home, as I would like to save on the money I spend on hosting. I find Nextcloud suits my needs perfectly, and I would like to continue using the service.

However, I am not so knowledgeable when it comes to security, and I'm not too sure whether I have done enough to secure my instance against potential attacks, or what additional things I should consider when moving the hosting from a VPS to my own server. So that's where I am hoping for some input from this community. Wherever it shines through that I have no idea what I'm talking about, please let me know. I have no reason to believe that I am being specifically targeted, but I do store sensitive things there that could potentially compromise my security elsewhere.

Here is the basic gist of my setup:

  • My Linode account has a strong password (>20 characters, randomly generated) and I have 2FA enabled. It required security questions to set up 2FA, but the answers are all random answers that have no relation to the questions themselves.
  • I've disabled SSH login for root. Instead, I have a new user with a custom name that is in the sudo group. This is also protected by a different, strong password. I imagine this makes automated brute-force attacks a lot more difficult.
  • I have set up fail2ban for sshd. Default settings.
  • I update the system at least every other week.
  • Nextcloud is installed with the AIO Docker container. It gets a security rating of A from the Nextcloud scan, failing only on not being on the latest patch level, as these are released more slowly for the AIO container. However, updates for the container are applied automatically, and maintaining the container is a breeze (except for a couple of problems I had early on).
  • I have server-side encryption enabled. Not client-side, as my impression is that the module is not working properly.
  • I have daily backups with borg. These are encrypted.
  • Images of the server are also backed up daily on Linode.
  • It is served by an Apache web server that is exposed to outside traffic over HTTPS, with DNS records handled by Cloudflare.
  • I would've liked to use a reverse proxy, but I could not figure out how to use one together with the Apache server. I have previously set up an Nginx reverse proxy on a test server, but then I used a regular Docker image for Nextcloud, not the AIO.
  • I don't use the server to host anything else.
42
submitted 10 months ago* (last edited 10 months ago) by [email protected] to c/[email protected]

I'm still a fairly new Linux user (on Tuxedo OS), and I just ran into an issue that is new to me. If I try to update my system, either via the command line or Discover, the apt update command fails. This is the output:

E: Could not get lock /var/lib/apt/lists/lock. It is held by process 1635 (apt-get)
N: Be aware that removing the lock file is not a solution and may break your system.
E: Unable to lock directory /var/lib/apt/lists/

Process 1635 is apt-get update run by root, and it persists through restarts. I am tempted to try to kill it (kill 1635), but I'm not sure if anything could break from that, so I thought I'd ask for help first before I do something stupid.

EDIT:

I have managed to update my system by killing the process, which releases the lock, and then going on to do a normal sudo apt update and sudo apt upgrade. For the sake of troubleshooting, I tried to add back my third-party repos one by one, and none of them caused any problem. However, after rebooting, the same issue as described above happens again. Software updates are set to "Manually" in the system settings.

In addition, every time I run sudo apt upgrade, some update related to initramfs fails at the end. My disk is encrypted using cryptsetup, and as I've come to understand, I should be very careful doing anything related to initramfs when that is the case. Here is the output:

Processing triggers for initramfs-tools (0.140ubuntu13.2) ...
update-initramfs: Generating /boot/initrd.img-6.2.0-10018-tuxedo
I: The initramfs will attempt to resume from /dev/dm-2
I: (/dev/mapper/system-swap)
I: Set the RESUME variable to override this.
zstd: error 25 : Write error : No space left on device (cannot write compressed block) 
E: mkinitramfs failure zstd -q -1 -T0 25
update-initramfs: failed for /boot/initrd.img-6.2.0-10018-tuxedo with 1.
dpkg: error processing package initramfs-tools (--configure):
 installed initramfs-tools package post-installation script subprocess returned error exit status 1
Errors were encountered while processing:
 initramfs-tools
E: Sub-process /usr/bin/dpkg returned an error code (1)

EDIT 2:

The issue seems to have been narrowed down to a failure of Tuxedo's driver configuration service that runs at boot. It is this service that calls apt-get (something I should've seen earlier...), and systemctl status reveals some errors:

aug. 08 15:33:56 laptop systemd[1]: Starting Tomte-daemon, finishes tasks that could not be accomplished before...
aug. 08 15:34:06 laptop tuxedo-tomte[1393]: no network found!! some fixes might not be applied correctly
aug. 08 15:34:06 laptop tuxedo-tomte[1393]: systemctlCmd: systemd-run --on-active="30sec" tuxedo-tomte configure all >/dev/null 2>&1

I really appreciate the help from everyone so far. It's a good experience asking for help here, and I've learned a lot from your answers. Makes being a Linux newbie a lot easier. So thank you :)

Since this seems to be a very specific issue related to Tuxedo's own services, I will contact their support to get their input on what to do next.


cyberwolfie

joined 11 months ago