[-] [email protected] 3 points 7 hours ago
import time
from Crypto.Protocol.KDF import scrypt  # the call signature matches PyCryptodome's scrypt
# SCRYPT_N, SCRYPT_R, SCRYPT_P and SCRYPT_KEY_LEN are constants defined elsewhere in the script

def generate_proof_of_work_key(initial_key, time_seconds):
    proof_key = initial_key
    end_time = time.time() + time_seconds
    iterations = 0
    while time.time() < end_time:
        proof_key = scrypt(proof_key, salt=b'', N=SCRYPT_N, r=SCRYPT_R, p=SCRYPT_P, key_len=SCRYPT_KEY_LEN)
        iterations += 1
    print(f"Proof-of-work iterations (save this): {iterations}")
    return proof_key


def generate_proof_of_work_key_decrypt(initial_key, iterations):
    proof_key = initial_key
    for _ in range(iterations):
        proof_key = scrypt(proof_key, salt=b'', N=SCRYPT_N, r=SCRYPT_R, p=SCRYPT_P, key_len=SCRYPT_KEY_LEN)
    return proof_key

The first function is used during the encryption process, and the while loop clearly runs until the specified time duration has elapsed. So encryption takes the full 5 days no matter how fast your computer is, and to decrypt, you have to redo the same number of iterations your computer managed in that time. Decrypting on the same computer should take a similar amount of time, but a different computer that is faster at these operations will decrypt it faster.

[-] [email protected] 5 points 7 hours ago

It's a very short Python script and I'm confident I get the general idea - there's absolutely nothing related to current time in the decryption process. What they refer to as a "time lock" is just encrypting the key in a loop (so the encrypted key from one loop becomes the plain text for the next one) for the specified duration and then telling you how many iterations were done. That number then becomes a second part of the password - to decrypt, you simply provide the password and the number of iterations, nothing else matters.
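A self-contained sketch of that claim, using Python's stdlib `hashlib.scrypt` in place of the script's `scrypt` wrapper (the parameters below are made-up small values so it runs quickly, not the script's real constants):

```python
import hashlib
import time

# Stand-in for the script's scrypt call; small made-up parameters
def stretch(key):
    return hashlib.scrypt(key, salt=b'', n=1024, r=8, p=1, dklen=32)

# "Encrypt" side: iterate for a wall-clock duration, record the count
def timed_key(initial_key, seconds):
    key, iterations = initial_key, 0
    deadline = time.time() + seconds
    while time.time() < deadline:
        key = stretch(key)
        iterations += 1
    return key, iterations

# "Decrypt" side: the clock never appears - only the count matters
def replay_key(initial_key, iterations):
    key = initial_key
    for _ in range(iterations):
        key = stretch(key)
    return key

key, n = timed_key(b'hunter2', 0.2)
assert replay_key(b'hunter2', n) == key  # same key from (password, count)
```

Nothing stops a faster machine from running `replay_key` through the same count in far less wall-clock time, which is exactly the point.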

[-] [email protected] 13 points 2 days ago

Yeah, even the TLDR makes it sound more like Qualcomm is yielding to the pressure from OEMs who want to be able to offer longer updates

[-] [email protected] 10 points 2 weeks ago* (last edited 2 weeks ago)

xrandr is Xorg only, it doesn't work with Wayland. You should be able to make SDDM use your Plasma display configuration - https://wiki.archlinux.org/title/SDDM#Match_Plasma_display_configuration

No clue if that's going to fix your issues, but at least it's supposed to work with Wayland.

[-] [email protected] 2 points 3 weeks ago

So I did look more into it, and apparently the open firmware is technically compatible with PCIe cards using this chip, but doesn't provide any advantages over just wiping the firmware and letting the chip default to its built-in fallback firmware, and so the maintainer doesn't see any value in explicitly supporting it.

Now the question is whether you consider the proprietary fallback firmware acceptable to run - this might sound weird, but for example the FSF has explicitly made exceptions for devices with built-in firmware so they can qualify for the Respects Your Freedom certification, so if your view aligns with theirs, you might consider this completely OK. If not, the free firmware appears to have a similar feature set; you'll just have to jump through more hoops.

Also do note that both the fallback firmware and the free firmware are missing many features of the proprietary firmware, so make sure to check it's not missing anything you need (wake on LAN, Jumbo frames and PXE boot seem like the most notable missing features to me).

More info on support for various PCIe cards

[-] [email protected] 4 points 3 weeks ago* (last edited 3 weeks ago)

It's less about the computer and more about the card itself - Talos II and Blackbird both use the BCM5719 chip for their integrated NICs. Basically, you're flashing part of the motherboard with this firmware. A PCIe card built around the same chip might connect the interfaces in a different way, and firmware doesn't generally have a way of poking around to find out how everything's set up from the hardware side of things - it needs to just know this, and that's why there are separate firmware builds for different hardware.

If you flash one of these files to that card, it might just so happen to work perfectly, but it most likely won't. You would need to figure out how it's wired up and modify the firmware with that knowledge. And then you could use the modified open firmware with that specific card model on any computer that supports the proprietary firmware, because IIUC this is meant to be functionally identical.

So in short, no, you cannot currently use this open firmware on any computer other than Talos II and Blackbird, but for a slightly different reason than you might think.

[-] [email protected] 6 points 3 weeks ago

Also, the Arch repos are pretty much just an "AUR with binaries" - they contain the same PKGBUILD files used by AUR packages, because that's how Arch packages are built. So you can just download an Arch package PKGBUILD, modify it however you wish, and then build and install it.

[-] [email protected] 2 points 3 weeks ago

If it doesn't come at the expense of battery wear, then sure, lower charge time is just better. But that would make phone batteries the only batteries that don't get excessively stressed when fast charging. Yeah, phone manufacturers generally claim that fast charging is perfectly fine for the battery, but I'm not sure I believe them too much when battery degradation is one of the main reasons people buy new phones.

I have no clue how other manufacturers do it (so for all I know they could all be doing it right and actually use slow charging), but Google has a terrible implementation of battery conservation - Pixels just fast charge to 80%, then wait until some specific time before the alarm, then fast charge the rest. Compare that to a crappy Lenovo IdeaPad laptop I have, whose battery conservation feature sets a charge limit AND a power limit (60% with 25W charging), because it wouldn't make sense to limit the charge and still use the full 65W for charging.
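The Pixel behavior described above boils down to scheduling a resume time from the alarm. A hypothetical sketch - the charge rate and safety margin here are invented numbers, not Google's actual values:

```python
# Hypothetical sketch of "charge to 80%, then top up just before the
# alarm" scheduling. All rates and margins below are invented.
def resume_time(alarm_ts, pct_remaining=20, pct_per_hour=40, margin_s=600):
    # seconds needed to charge the remaining percentage at the fast rate
    charge_s = pct_remaining / pct_per_hour * 3600
    return alarm_ts - charge_s - margin_s

alarm = 7 * 3600             # 07:00 as seconds since midnight
resume = resume_time(alarm)  # 22800 s = 06:20 with these made-up numbers
```

Note there's no slow-charging step anywhere in this scheme - the top-up is still done at full speed, which is the complaint.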

[-] [email protected] 4 points 3 weeks ago

It doesn't slow charge, at least not on Pixel 7a. Well, you could argue whether 20W is slow charging, but it's all this phone can do.

It just charges normally to 80%, stops, and then resumes charging about an hour or two before the alarm. And last time I used it, it had a cool bug where if it fails to reach 80% by the time it's supposed to resume charging, it just stops charging no matter what the current charge level is. Since that experience, I've turned this feature off and just charge it whenever it starts running low.

[-] [email protected] 40 points 4 weeks ago* (last edited 4 weeks ago)

If it is an Arch-based distro (sorry, I don't recognize the package manager), then this might just be the recent Wine update that made it 700 MB smaller (which would mean the rest of your system grew 300 MB)

I made a post here about it: this one

Btw, is there a way to link to a post in a way that resolves on everyone's separate instance instead of hard coding it to my instance?

[-] [email protected] 4 points 4 weeks ago

Cheap Bluetooth might have connection hitches

Fair enough, but I've only ever seen this happen with cheap wireless cards / chipsets that do both Bluetooth and WiFi and don't properly avoid interference between these two (for example, I can get perfectly functioning Bluetooth audio out of my laptop with shitty Realtek wireless card if I completely disable WiFi (not just disconnect)). I think this is less of an issue for dedicated Bluetooth devices.

Bluetooth doesn't work with airplane mode, although I think most airplanes these days aren't actually affected or we'd have planes dropping out of the sky daily.

Yeah, that's true. As for the second part, AFAIK there was never an issue with 2.4 GHz radios (which is the frequency band Bluetooth uses) interfering with planes, it was more of a liability / laws thing - the plane manufacturer never explicitly said that these radios are safe (so the airline just banned them to be safe) and/or laws didn't allow non-certified radios to operate on planes.

Also, does Bluetooth get saturated the way WiFi does?

Eventually yes, but it's much more resilient than WiFi - 2.4 GHz WiFi only has three non-overlapping channels to work with (and there's a whole thing with the in-between channels being even worse for everyone involved than everyone just using the same correct three channels that I won't get into), while Bluetooth slices the same spectrum into 79 fully usable channels. It also uses much lower transmission power, so signal travels a shorter distance. And unlike WiFi, it can dynamically migrate from channel to channel (in fact, it does this even without any interference). 100 people actually seeing each other's devices might be a problem, but I don't think that's a realistic scenario - Bluetooth will use the lowest transmit power at which it can get a reliable link, so if everyone's devices are only transmitting over a meter or so, there shouldn't be any noticeable interference on the other side of the plane.
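For reference, the channel math above checks out (frequencies in MHz, per the Bluetooth Classic and 2.4 GHz 802.11 channel plans):

```python
# Bluetooth Classic: 79 channels, 1 MHz wide, spanning 2402-2480 MHz
bt_channels = [2402 + k for k in range(79)]
assert len(bt_channels) == 79 and bt_channels[-1] == 2480

# 2.4 GHz Wi-Fi: with 20 MHz-wide channels, only 1, 6 and 11
# (2412, 2437, 2462 MHz) fit without overlapping each other
wifi_non_overlapping = {1: 2412, 6: 2437, 11: 2462}
```

So Bluetooth's adaptive hopping has 79 narrow slots to dodge interference with, while a 2.4 GHz Wi-Fi network is stuck on one of three wide ones.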

[-] [email protected] 20 points 4 weeks ago* (last edited 4 weeks ago)

I don't really see the big problem here? Like sure, it's silly that it's cheaper to make wireless headphones than wired ones (I assume - the manufacturers are clearly not too bothered by trademarks and such if they put the Lightning logo on it, so they wouldn't avoid a wired solution just due to licensing fees), but what business does Apple have in cracking down on this? Other than the obvious issues with trademarks, but those would be present even if these were true wired earphones. It's just a knockoff manufacturer.

The cheapest possible wired earphones won't sound much better than the cheapest possible wireless ones, so sound quality probably isn't a factor. And on the plus side, you don't have multiple batteries to worry about, or you could do something funny, like plugging the earphones into a powerbank in your pocket to get freak "hybrid" earphones with multi-day battery life (they're not wireless, but also not tethered to your phone). On the other hand, you do waste some power on the wireless link, which is not great for the environment in the long run (the batteries involved will see marginally more wear).

Honestly the biggest issue in my mind is forcing people to turn on Bluetooth, but I don't think this will change anyone's habits - people who don't know what Bluetooth is will definitely just leave it on anyway (it's the default state), and people technical enough to want to turn it off will recognize that there's something fishy about these earphones.
