FlowVoid

joined 1 year ago
[–] [email protected] 3 points 9 months ago* (last edited 9 months ago)

Sure, but the point is they have transponders. And pilots generally use them (because it's safer) unless they have a good reason not to.

[–] [email protected] 3 points 9 months ago

Stealth aircraft spend a lot more time flying training missions over friendly territory than combat missions over enemy territory. They use transponders on training missions, such as this one, because they want to be easily visible to other military and civilian pilots.

[–] [email protected] 10 points 9 months ago* (last edited 9 months ago) (2 children)

F-35s have transponders, just like every other aircraft that flies in the US. They are necessary to avoid mid-air collisions. When flying a stealth mission in enemy airspace, pilots can turn the transponders off.

Unfortunately, the transponder on this particular F-35 is not working.

[–] [email protected] 4 points 9 months ago* (last edited 9 months ago)

A key issue, often overlooked, is that US law imposes significant restrictions on the export and sale of military hardware.

Starlink is currently not considered military hardware. SpaceX is desperately trying to keep it that way; their ultimate goal is to sell subscriptions to civilians. That's why they get anxious when it is openly used for military purposes.

In this regard, Starlink is somewhat similar to civilian GPS receivers, which are required to shut down above roughly 1,200 mph so they can't be used in missiles.

[–] [email protected] 2 points 9 months ago* (last edited 9 months ago) (2 children)

If we had a policy of nationalizing contracted military infrastructure, then nobody would sign contracts with the military.

And while this may sound good to some, it sure wouldn't be in Ukraine's interest. Unreliable Starlink access is better than no Starlink access at all.

[–] [email protected] 1 point 10 months ago

Ok, but "It would be great if people had to buy more of the thing" is not an accurate summary either. Putting a CD drive on a console does not mean you have to buy physical media.

[–] [email protected] 0 points 10 months ago* (last edited 10 months ago) (5 children)

There are situations where permission is not required to use copyrighted material, mainly "fair use". But AI training often does not qualify as fair use.

Otherwise, intellectual property is treated much like other types of property. For example, the person who owns a car can give you permission to use it. That doesn't mean you can do whatever you want with it. The owner gets to set the rules. They aren't "kings", but as owners they do have near-complete control over what you can do with their car.

When you upload something to social media, you (the content owner) give the host permission to display your content. That does not mean users who view your content have permission to do whatever they want with it.

There is plenty of open source code posted in repositories that are extensively mirrored, yet the code has lengthy conditions attached to it. If you use it in violation of the stated license, you could find yourself on the losing end of a lawsuit.

There are plenty of photographs posted on Instagram, which is likewise designed to display them to anyone who asks. If a professional photographer finds that you've used one of their Instagram photos without permission, you could find yourself on the losing end of a lawsuit.

And the Fediverse may be a non-commercial decentralized platform, but copyright protection doesn't magically disappear here. You give servers a license to display what you wrote, but you may reserve the same rights over your IP as software developers and photographers do over their own.

[–] [email protected] 0 points 10 months ago* (last edited 10 months ago) (7 children)

Copyright holders can prohibit all use of their work. If they can prohibit all use, then they can prohibit any specific use.

And they rarely give a wide-ranging "permission for their posts to be viewed by the public." That's like saying "I can do whatever I want with this source code, because the developer gave permission for the code to be viewed by the public." The legal language is often far more specific. There may be prohibitions on commercial usage, requirements for attribution, or even requirements that future code be licensed in a similar manner. There is almost always fine print when content appears to be "free".

Of course, it's possible that you could find a creative way around the legalese. Pointing a camera at a browser may work, until the fine print changes to prohibit that too. But anyway, that's not what AI developers have done in the past. So one way or another, they may be forced to change their ways.

Hollywood and other big businesses will still find a way to train AI as usual. They are already used to paying musicians when a song is used in a movie. They can easily pay copyright holders for a license to use content as training data. It's far safer - and more efficient - to pay up than try to get around the rules with a camera pointed at a screen. As a bonus, content creators who contribute training data may benefit from royalties.

Nevertheless, I think it will become more difficult for people who think they can easily get "free" training data from the web, just like 20 years ago when people thought they could easily get "free" music from Napster.

[–] [email protected] 1 point 10 months ago* (last edited 10 months ago) (9 children)

That's a public performance, which is a form of redistribution. That's not relevant to AI training.

Copyright law defines whether or not you can make a copy of a work. The copyright holder can deny permission to make any copies, or grant permission to make a copy only under certain conditions. Those conditions are completely up to the copyright holder. They might prohibit public performance, but by no means is public performance the only thing a copyright holder can prohibit. It's simply a very common prohibition.

You are trying to generalize from a specific right (viewing the content in a browser) to a general right to "look" at the content, and from there to the right to train an AI. But legally those are not the same at all. You may be granted some, all, or none of those rights.

Suppose you are in a modern art gallery. You have been given the right to "look" at someone's art. You can nevertheless be prohibited from photographing the art, even though the camera is also just "looking" at it. The owner of the art can attach whatever conditions they want to your photo, including how long you can keep it and exactly what you may do with it.

For example, you could be allowed to photograph the art for home use but not for wider distribution. You could be allowed to photograph the art for classroom use, but not for AI training. If you are not willing to follow all of the conditions, then you can't photograph the art at all.

The same is true of text. Websites give you permission to make a copy of their text for display in your browser. And they can set whatever rules they like for how else your copy may be used.

Except for being banned from using public data that non-American AIs are able to use.

Sure. Of course, America could also ban those non-American AIs from being used in the US. Just as America bans other products that infringe patents/IP.

[–] [email protected] 0 points 10 months ago* (last edited 10 months ago) (11 children)

Which is way more than what an AI model retains.

It makes no difference what the AI model retains. The only question is whether you had permission to use your copy in the manner that you did.

So, for instance, suppose you made a copy of a Disney movie in any fashion (by torrent, by videotaping a screening, by screen-capturing Disney+, etc.), showed it to a classroom in its entirety, and then deleted it immediately afterward. You infringed copyright, because you did not have permission to use it in that manner even once. It makes no difference how long you retained your copy.

Note that it would also make no difference if there were actually no students in the classroom. Or if the students were actually robots. Or just one robot, or a software AI. Or if you didn't use a screen to show the material but simply sent the file electronically to the AI. Or if the AI deleted the file shortly after receiving it. You still didn't have permission to use your copy in the manner you did, even once. Which means it was illegal.

America's laws are not global laws.

True. But the GDPR has shown us that a country can take measures to protect its data globally.

If they wish to ban AI training this will become starkly apparent.

In every other field, researchers have long been required to use opt-in databases for their work. They can't just "scrape" your medical records without your consent in order to study a particular disease. That would be wildly unethical.

Yet research, including AI research, has thrived in the US even with such ethical requirements. I am confident future AI researchers in America can be both ethical and successful.
