this post was submitted on 09 Aug 2023
278 points (100.0% liked)

Technology


In its submission to the Australian government’s review of the regulatory framework around AI, Google said that copyright law should be altered to allow for generative AI systems to scrape the internet.

[–] [email protected] 42 points 11 months ago* (last edited 11 months ago) (3 children)

It’s not turning copyright law on its head; in fact, asserting that copyright needs to be expanded to cover training on a dataset IS turning it on its head. This is not a reproduction of the original work; it’s learning about that work and making a transformative use of it. A generative work using a trained dataset isn’t copying the original; it’s learning about the relationships the original has to the other pieces in the dataset.

[–] [email protected] 12 points 11 months ago (2 children)

The lines between learning and copying are being blurred by AI. Imagine if you could replay a movie in your head any time you like, just from watching it once. Current copyright law wasn’t written with that in mind. It’s going to be interesting to see how this goes.

[–] [email protected] 7 points 11 months ago (3 children)

Imagine being able to recall the important parts of a movie, its overall feel, and its significant themes and attributes after only watching it one time.

That's significantly closer to what current AI models do. It's not copyright infringement that there are significant chunks of some movies I can play back in my head precisely. First, because memory being owned by someone else is a horrifying thought, and second, because it's not a distributable copy.

[–] [email protected] 6 points 11 months ago (1 children)

The thought of human memory being owned is horrifying, but we’re talking about AI. This is a paradigm shift, and new laws are inevitable. Do we want AI to be able to replicate small creators’ work and ruin their chances at profitability? If we aren’t careful, we’re looking at yet another extinction wave where only the richest, who can afford the AI, can make anything. I don’t think it’s hyperbole to be concerned.

[–] [email protected] 1 points 11 months ago

The question to me is how you define what the AI is doing in a way that isn't hilariously overbroad to the point of saying "Disney can copyright the style of having big eyes and ears", or "computers can't analyze images".

Any law expanding copyright protections will be 90% used by large IP holders to prevent small creators from doing anything.

What exactly should be protected that isn't?

[–] [email protected] 2 points 11 months ago

How many movies are based on other movies? A lot, even if only loosely. If you stopped allowing that, you would run out of new things to do.

[–] [email protected] 0 points 11 months ago* (last edited 11 months ago) (1 children)

my head [...] not a distributable copy.

There has been an interesting counter-proposal to that: make all copies "non-distributable" by replacing 1:1 copying with AI-to-AI learning, so the new AI would never hold a 1:1 copy of the original.

It's in part embodied in the concept of "perishable software", where instead of having a 1:1 copy of an OS installed on your smartphone/PC, a neural network hardware would "learn how to be a smartphone/PC".

Reinstalling would mean "killing" the previous software and training the device again.

[–] [email protected] 1 points 11 months ago (1 children)

Right, because the cool part of upgrading your phone is trying to make it feel like it's your phone, from scratch. Perishable software is anything but desirable, unless you enjoy having the very air you breathe sold to you.

[–] [email protected] 0 points 11 months ago (1 children)

Well, depends on desirable "by whom".

Imagine being a phone manufacturer: all your users run a black box only you have the means to re-flash or upgrade, and software developers have to go through you so you can train users' phones to "behave like they have the software installed".

It's a dictatorial phone manufacturer's wet dream.

[–] [email protected] 1 points 10 months ago

Yes, that's exactly my problem with it.

[–] [email protected] 0 points 11 months ago* (last edited 11 months ago) (2 children)

Imagine if you could replay a movie any time you like in your head just from watching it once.

Two points:

  1. These AIs can't do that; they need thousands or millions of repetitions to "learn" the movie, and every time they "replay" it, the result differs from the original.

  2. "Learning by rote" is something fleshbags can do, and are in fact required to do by most education systems.

So either humans have been breaking copyright all this time, or the machines aren't breaking it either.

[–] [email protected] 1 points 11 months ago (1 children)

You have one brain. You could have as many instances of AI as you can afford. In a general sense, it’s different, and acting like it’s not is going to hit you like a freight train if you don’t prepare for it.

[–] [email protected] 1 points 11 months ago* (last edited 11 months ago) (1 children)

That's a different goalpost. I get the difference between 8 billion brains, and 8 billion instances of the same AI. That has nothing to do with whether there is a difference in copyright infringement, though.

If you want another goalpost, that IMHO is more interesting: let's discuss the difference between 8 billion brains with up to 100 years life experience each, vs. just a million copies of an AI with the experience of all human knowledge each.

[–] [email protected] 0 points 11 months ago (1 children)

It’s all theoretical at this stage, but like everything else that society waits until it’s too late for, I think it’s reasonable to be cautious and not just let AI go unregulated.

[–] [email protected] 0 points 11 months ago (1 children)

It's not reasonable to regulate stuff before it gets developed. Regulation means establishing limits and controls on something, and those can't reasonably be defined before that "something" even exists, much less tested to decide whether the regulation has the effects it intends.

For what it's worth, a "theoretical regulation" already exists: Asimov's Laws of Robotics. It turns out current AIs are not robots, and that regulation is nonsense when applied to Stable Diffusion or LLMs.

[–] [email protected] 1 points 11 months ago

I disagree. Over the last twenty years or so we have plenty of examples of things that should have been regulated from the start but weren’t, and now it’s very difficult to do so. Every “gig economy” business, for example.

[–] [email protected] 0 points 11 months ago

Well, fleshbags have to pay several years' worth of salary to get their education, so by your comparison, Google's AI should too.

[–] [email protected] 9 points 11 months ago (3 children)

This is artificial pseudointelligence, not a person. It doesn't learn about or transform anything.

[–] [email protected] 5 points 11 months ago* (last edited 11 months ago)

I'm not the one anthropomorphising the technology here.

[–] [email protected] 3 points 11 months ago* (last edited 11 months ago)

To take those statements seriously, you will need to:

  • define and describe in detail the processes by which "a person" learns
  • define and describe in detail how "a person" transforms anything
  • define and describe in detail what is "intelligence"
  • define and describe in detail what these "artificial pseudointelligences" are doing
  • define and describe in detail the differences between the latter and the previous points

Otherwise, I'll claim that "a person" runs exactly the same processes (neural networks, LLMs, hallucinations), and that calling these AIs "artificial pseudointelligences" is nothing other than dehumanizing a minority because you feel threatened by them.
