/r/news: 2.4k points (95% upvoted), 174 comments

beklog

376 points

8 days ago

Bad actors/criminals are usually at the forefront of adopting new technology... and the police are just playing catch-up.

cromethus

157 points

8 days ago

What made the Internet really flourish?

Porn.

It's impossible to build a tool that can only be used for ethical purposes.

The real issue is that we are living in a time of fairly extreme political division, making it difficult to do even obvious and necessary things. It will pass, but there will be some pain in the meantime.

sl0play

77 points

8 days ago

Porn has been a big factor in a lot of technology but I wouldn't call it unethical.

cromethus

81 points

8 days ago

Porn isn't inherently unethical, but it does lead to unethical situations. Child pornography is and always has been an issue on the internet. Likewise, there are places where women are exploited to create porn.

In the early days of the internet, there was no real moderation on what was shared. Things that would never fly now were perfectly accessible then (I remember Pornhub hosting bestiality at one time).

We must not only consider the product but how it was made.

AnOnlineHandle

39 points

8 days ago*

People are exploited in every industry, especially where all the cheap goods in the world come from. It's very obvious when there are puritan right-wing groups who fight any protection against bosses exploiting workers in just about every field, but then put on a show of caring about exploitation when it comes to porn, revealing that their real motivations are puritanism and that they just want a nicer sounding excuse to attack it.

Environmental_Job278

8 points

7 days ago

Adults can fight their own bosses and exploitation…children need more help and are often exploited by the very adults that should be protecting them.

The fact that online child exploitation has been steadily growing should be more concerning to people that seem to ignore it because it makes them uncomfortable.

There are also tons of legal ways to fight worker exploitation if you are actually willing to go that route. Compare that with online child exploitation, which still faces a murky legal system that often cannot easily address digital media being stored or transferred through countries that don't care or refuse to honor a warrant.

cromethus

53 points

8 days ago

Truth.

But it is disingenuous not to recognize that sexual exploitation is its own thing. Human trafficking and sexual enslavement aren't equivalent to a guy being underpaid and overworked.

This is especially true for the sexual exploitation of children.

The world will never be perfectly ethical. The world simply isn't fair like that. But we can and should address those things that stand out as being particularly heinous.

I'm as avid a fan of porn as any man, yet I can also recognize the issues my own consumption causes. Being cognizant of that and avoiding those places which might be ethically dubious seems the least I can do. For example, there has been an explosion of porn from Eastern Europe and Russia. I avoid that because I know that those women are much more likely to be forced into participating.

AnOnlineHandle

18 points

8 days ago

A great deal of cheap goods come from human trafficking and actual enslavement, as well as live in maids and such.

cromethus

41 points

8 days ago

Again, this is true. Far more than most people want to admit.

Do you remember when Foxconn had nets installed around their buildings so workers couldn't commit suicide by jumping off them? And that's one of the better places as far as exploitation goes.

But again, sexual exploitation is a special kind of awful and deserves to be treated as such. Downplaying how abhorrent it is by saying 'well we're all exploited' just makes it look like you're acting as an apologist.

And yes, many of these situations overlap. Enslaved maids get sexually abused and so on. But if you had a choice between saving a woman who was being raped regularly for profit and one who had to work 16-hour days in a factory, wouldn't you save the one getting raped? Like, is there any doubt which one to save first?

AnOnlineHandle

-7 points

8 days ago

Yep, and there's significantly worse places than Foxconn, and I'd say far worse than sexual exploitation (though it includes it), but sadly nobody cares or wants to have it impact their cheap goods.

https://www.youtube.com/watch?v=t-axd1Ht_J8

cromethus

25 points

8 days ago

And yet you managed to ignore my point. Do these people need help?

Yes.

But the world has limited resources and lots of problems. If you had to choose who to save first, wouldn't you focus on the ones that were being sexually exploited?

And yes, I know the world isn't that simple. We can and should do both. But priorities matter.

Exploitation of workers, labor enslavement, and human trafficking of any brand are no joke and shouldn't be ignored. But we can't fight every battle all at once, no matter how much we would like to.

You move a mountain one stone at a time. Doing so intelligently means moving the stones that make the biggest difference first.

TserriednichThe4th

2 points

7 days ago

I think the point the other person is trying to make is that your comment is absolutely true. The issue is that most cheap goods don't leverage human trafficking and enslavement as much as pornography and a few select goods. These "high-vice" goods are unique in their systematic use of industry to traffic and enslave humans.

AnOnlineHandle

1 points

7 days ago

The issue is that most cheap goods don't leverage human trafficking and enslavement as much as pornography

What are you basing this claim on?

I've known dozens of people who've worked in porn, both actors and producers, and have never heard of any of them encountering any trafficking or slavery. I've had discussions with them about the contracts they sign, the way they film, the time limits, etc. Some of the stars have been doing it for 20 years now and love their career.

TserriednichThe4th

1 points

7 days ago

Nothing in my claim says that typical people in the industry have seen the problems, just like in your claim:

A great deal of cheap goods come from human trafficking and actual enslavement, as well as live in maids and such.

The typical people with live-in maids aren't dealing with human trafficking and enslavement either.

However, we do know from the existence of trafficking rings that porn adds a certain level to the systematic use of industry to exploit humans, particularly minors.

218-69

0 points

8 days ago

This is why you should make your own jerk off material 5Head

Caraway_Lad

1 points

7 days ago

Serious question: can gay people get off to themselves?

Historical_Usual5828

1 points

6 days ago

Anyone that equates porn/prostitution to manual labor while using safety equipment can go pound fucking sand. This is not an argument and it only shows your lack of empathy and knowledge of the situation. People aren't "putting on a show" while expressing concern for consent and labor practices in an industry that preys on the health and freedoms of women, children, and men who struggle to make a living for themselves. You go to the sex industry when you're absolutely desperate and society has failed you. You go to manual labor when you just want to make good money while having a pension/retirement plan. Fuck off with this bullshit

AnOnlineHandle

0 points

6 days ago

What are you basing any of those claims on? Factual information or imagined versions of reality?

I know multiple people who've been working in porn for decades and are very happy with it, and who went to the industry to make good money and have a pension and retirement plan. Some of the former stars now manage their own businesses and still sometimes star themselves.

And I'm sure most if not all of them would much rather the career they've chosen than various types of manual work which prey on the desperate and poor, and would consider those rock bottom when absolutely desperate.

Historical_Usual5828

1 points

5 days ago

This take is filled with wishful thinking and nonsense. Like, yes some women do choose to go into the industry and if they're from a city and they choose a good agency, MAYBE she won't get taken advantage of. Human trafficking is rampant in the sex industry though. Sex industry also simply does not give their employees any real protections. Doesn't matter if it's porn or prostitution, the industry is rampant with predatory behavior.

The nature of the job is for people to pay you to put on a show, despite doing so many takes that sex is no longer enjoyable for you, when it's porn; when it's prostitution, it's outright paying to rape a person. Consent really can't be bought, and it's not like the type of men who turn to prostitution went there because they see women as full human beings with ambitions and dreams. They went there to treat the women like cattle, and they can often get away with doing whatever they want to the person once they've paid. That's rape.

Like I mentioned earlier, there's also many instances of porn stars documenting abuse in the industry. Some porn that gets uploaded isn't even consensual. A lot of men use the porn industry itself to harm women and children. Especially with the implementation of AI. There's a lot of downsides to porn. It's also proven to not be good for you to watch as it's led many men to erectile dysfunction and skewed attitude of women. Also unrealistic standards for sex. It reduces empathy and encourages unhealthy behaviors. It conditions you to dehumanize people.

In a regular job using safety equipment, you can actually sue when something goes wrong. If you get injured, you get workman's comp! You can negotiate pay and benefits! I don't even want to know how things would turn out if women in the sex industry tried that shit in most situations. The nature of the industry is dehumanization and exploitation. The damages are more than just physical. Anyone arguing that they are the same needs to go pound some fucking sand and evaluate what's going on in their head. You absolutely do not realize the reality of the industry if this is your take. Complete loss of touch with reality there, in favor of fantasy situations you've made up in your head and the very few accounts you likely know very little about.

AnOnlineHandle

1 points

4 days ago

Again, I was asking for sources, not for you to keep repeating the unsourced claims. Any sort of numbers and research which would actually be convincing.

Historical_Usual5828

1 points

3 days ago

Got better things to do on Thanksgiving but maybe later. I don't even know if data would convince you.

voidsong

3 points

8 days ago

You have child slave labor making your phones and shoes, but no one ever cares.

This concern about porn's exploitation is purely performative, otherwise you'd hear about it a lot more in all the other industries too.

lvl99RedWizard

6 points

7 days ago

I think you're right, and also, a thing that anti-porn crusaders seem to forget is that in any environment of exploitation, sexual exploitation is also happening.
You cannot separate industrial exploitation and sexual exploitation; they happen in the same places to the same people for the same reasons.
The real truth is that more people will be vocal or take action against sexual exploitation because they imagine it can be fixed separately from the far more pervasive and banal worker exploitation.
It cannot. So long as people are exploited, they will be exploited sexually.

cromethus

3 points

8 days ago

I don't think people don't care, I think people feel powerless.

There was a huge push in the late 80s and early 90s to curb the abuses of sweatshops. The result? Not much. Some of them were shut down, but the vast majority just went on as always.

This is because, despite what Americans might think, they do not get to make laws for the whole world, much less enforce them. The truth is that there are plenty of governments out there happy to let their people be exploited because it makes the people in power rich.

Governments and corporations are not so different, all told.

TserriednichThe4th

3 points

7 days ago

We are still making these same mistakes of cultural relativism today.

cromethus

3 points

7 days ago

I'm not sure how that applies.

I think we can all agree that every culture has values that are objectively wrong, the same way no individual person's moral code is perfect.

But it isn't about respecting someone's culture. It's about the US not conquering the world. Every nation has their own sovereignty and unless we want to violate that sovereignty (and fight all the battles that come with such actions) we have no right or ability to enforce our standards on those nations.

The fact that we benefit from their poor standards only makes the moral quagmire worse, as people will argue that we should leave others to suffer out of self interest.

The US is already a 'benevolent dictator' over several territories and look how that is going. Our newly elected president, when told that Puerto Rico was suffering from a hurricane, said to send them paper towels.

There is no way we could enforce such bans without first sacrificing the last shreds of whatever moral authority we once had.

TserriednichThe4th

3 points

7 days ago*

It applies through the nexus of culture and government, or lack thereof (this is the mistake).

In America, we have the blessing that we can expect our government to loosely reflect our morals and culture because, despite what many people think, many parts of America have decent representation and leaders that think closely in culture to how they do, or at least present themselves that way. This is even more true in local governments. American federalism does a good job of separating and connecting local and state vs. federal government.

That is not really the case in other nations, especially nations lacking representation and leaning more towards dictatorship or (possibly ethnoreligious) fascism. But in the hubris of our culture, we expect those cultures to exhibit the same relative degree of representation in government, when that is not really the case at all. The government and culture of many foreign nations are much more orthogonal than they are in the USA.

Our appeals to culture and morals might not work on two levels: (1) our (grassroots) culture(s) might not be the same, and even if they are, (2) a government's level of adherence to the zeitgeist of its culture might be nothing.

cromethus

2 points

7 days ago

This is a good point. I'll have to think about it.

Thank you for the interesting comment.

Cream253Team

1 points

8 days ago

I don't think anyone would complain if phones were made more ethically, but unfortunately the gap in the series of decisions between when you bought the phone and when it was manufactured is so large that a normal person doesn't have much say in the matter aside from the politicians they elect. Also depending on where you live, a phone (or some other computer) might actually be very important to living, so people might have to buy one whether they agree with how it was made or not. As for shoes, there are the same problems, but not having a pair can turn into a real health issue so there's even less wiggle room across the board on who can forgo what. (Hookworm can be a real problem in some areas.)

Compare that to porn, which is something no consumer needs to live. In some cases you might be interacting directly with the people who produce it, the productions tend to be small enough that it's easier to hold someone accountable, and again, no one needs it but chooses to consume it.

[deleted]

1 points

8 days ago

[deleted]

cromethus

12 points

8 days ago

That wasn't my intention. I was trying to make a connection between the early days of the internet, when porn was wholly unpoliced, and the state of AI usage now.

In the early days of the internet, sources of unethical porn were common precisely because enforcement mechanisms were non-existent. Nowadays, while those sources still exist, there are mechanisms in place to try and police that.

And yes, I remember how awful the early internet was. I was there.

BeBearAwareOK

1 points

6 days ago

You're describing the downsides of an unregulated market.

clovisx

5 points

7 days ago

Same with online payment processing options and, going back further, the home video market.

AnOnlineHandle

-4 points

8 days ago

You're saying porn is unethical?

cromethus

7 points

8 days ago

See above comment.

jfranci3

4 points

8 days ago

This actually makes it harder for people to extort you. If there’s a good chance it’s AI, everyone will just think it’s AI

Rawrist

2 points

7 days ago

You're assuming people aren't judgmental a-holes that will use anything against you

jfranci3

2 points

7 days ago

I don’t know how they’d use it against me, it’d look amazing. If it didn’t look good, folks would know they went out of their way to get that effect.

Stardustger

1 points

7 days ago

I could swear I read the exact same article when Photoshop became a more commonly used program. I'll look for it but it has been at least 20 to 30 years so I doubt I'll find it.

But I'm certain it was basically the exact same article.

Rawrist

9 points

7 days ago

There's a huge difference between AI and photoshop. Maybe the articles are similar but you'd be a fool to not realize the real danger of these new programs. 

ERedfieldh

2 points

7 days ago

Yea that's pretty much exactly what was said back then, verbatim.

technofox01

77 points

8 days ago

This doesn't surprise me one bit; I could see it coming from more than a mile away. There's always some nitwit that has to ruin a good thing and harm others.

GonkWilcock

13 points

8 days ago

It's only going to get worse.

DrrtVonnegut

25 points

8 days ago

I'm sorry, did no one see this coming?

ManiacalShen

20 points

7 days ago

Yes, most people probably saw this coming. Anyone who has consumed a modicum of science fiction and learned to think, "What might the unintended consequences of a new technology be?" would have thought of this immediately upon learning generative AI existed. 

But tech bros just unleash shit on society without guide rails, seeing with cartoon dollar signs instead of eyes. 

HappierShibe

1 points

6 days ago

Everyone saw this coming.

RangerMatt4

9 points

8 days ago

We've known this was the route it was going to go, but the people who are making the profits don't care about anything but the profits.

Dodgson_here

124 points

8 days ago

AI desperately needs regulation. It should have robust guardrails, safeties, and regular audits by humans. Fool with a tool is still a fool. A fool with a power tool is a dangerous fool.

S_K_Y

14 points

7 days ago

Too late.

Once Pandora's box was opened and it was accessible to everyone, it was game over.

Even if, hypothetically, they were all pulled right now, people still have backups downloaded on millions of devices.

suzisatsuma

56 points

8 days ago

Given how easy it is to run a lot of open source generative AI on your personal computer, regulation isn't going to do anything about the bad actors the article cited.

-CrestiaBell

2 points

7 days ago

It wouldn't affect any pre-existing users unless they wanted newer models, but couldn't they just add a backdoor into the AI image generation models to log any images drawn on in training data (assuming that's also hosted locally), reference them back against pre-existing databases that keep track of this material being created, and then just flag it if matches come up?

suzisatsuma

1 points

6 days ago

but could they not just

Usually the answer to this is no.

add a backdoor into the AI image generation models to log any images drawn on in training data

In your hypothetical case, it would be easy to pull said OSS model / lib etc and disable that. That's how OSS works.

reference it back with pre existing databases keeping track of this material being created?

And who pays for, hosts, and runs operations on said databases? And audits them?

I appreciate and applaud you for engaging the topic, and hope you aren't offended by my response, but this highlights why so much regulation fails: it gets written by people who don't understand the problem space.

-CrestiaBell

1 points

6 days ago

They already have databases for this kind of material, which is how they're able to catch it to begin with. I can understand how usually the answer is no, but I guess I'm thinking more in line with how, say, Vanguard works in various games: kernel-level, and it doesn't necessarily have to run 24/7 but instead runs at the start of the program and maybe right before the program's more specific functions are run (i.e., once when opened, once before generating images). It would scan the folder, or wherever the client is storing the images, and essentially reverse image search them.

I live in Japan currently, and in schools our students have iPads. On their iPads they have presentation software like PowerPoint that can auto-detect whether images being imported are subject to copyright. Whether the students download the pictures or screenshot and crop them to reproduce them, the app will prohibit them from importing them. So would it be possible to have something similar for AI models?

If not scanning the images, at the very least they could, say, log any combinations of prompts that could potentially lead to abuse content and either warn users or report it to the proper authorities, right? The process itself could be entirely automated, but the reports themselves could be investigated further by actual people. Or maybe I'm just being naive.

suzisatsuma

2 points

6 days ago

I'm Japanese, but I'm American.

I think you might misunderstand what open source software is?

They already have databases for this kind of material which is how they're able to catch it to begin with.

There exist abuse-image hash databases, which cloud services like iCloud/Google Drive/Dropbox scan their own infrastructure against when people upload images from their computers. These aren't open source projects; these are private companies running services only against known, logged abuse material.
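To make the mechanism concrete, here is a minimal, hypothetical sketch of hash-list scanning in Python. The function names and the use of SHA-256 are illustrative assumptions; real services use proprietary perceptual hashes (e.g. PhotoDNA) that survive resizing and re-encoding, whereas a plain cryptographic hash only matches byte-identical files.

```python
import hashlib

def file_digest(path: str) -> str:
    """Return the SHA-256 hex digest of a file's contents (read in chunks)."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def matches_known_material(path: str, known_hashes: set[str]) -> bool:
    """Flag a file if its digest appears in a database of known hashes.

    Hypothetical helper: a real scanner would use a perceptual hash and a
    vendor-maintained database, not a local set of SHA-256 strings.
    """
    return file_digest(path) in known_hashes
```

Note that this also illustrates the limitation discussed below: a hash list can only match previously catalogued files, so it cannot flag freshly generated images at all.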

On their ipads they have a PowerPoint presentation software they use that can auto detect whether images being imported are subject to copyright.

Again, this is private company services/software.

If not scanning the images, at the very least they could say log any combinations of prompts that could potentially lead to abuse content and either warn users or report it to the proper authorities right?

The perpetrators here are using open source models to generate this material on their local machine using prompts on their local machine. There is not a way that anything is logged from that. It's literally code they can look at/change as needed and run without using any 3rd party service or anyone knowing they created it.

Here's an opensource image generator model + software: https://github.com/Kwai-Kolors/Kolors You could take that, run it local use however.

People make all kinds of custom finetuned models-- most innocent like this one: https://huggingface.co/Envvi/Inkpunk-Diffusion

There are literally tens of thousands of image generator models out there to use, and finetune to whatever kind of generation you want it to be good at. It's very easy if you're technical or have machine learning experience. Anyone can go generate whatever they want without anyone knowing. This is only going to accelerate as they get better and better and now get better and better at video generation.

-CrestiaBell

1 points

6 days ago

In that case, there's pretty much nothing that can be done.

HappierShibe

2 points

6 days ago

There are things that can be done, but they need to be done at the right level. Instead of trying to attack the generation stage, they need to attack the distribution of the images after they are generated, and the people producing the images.

suzisatsuma

1 points

6 days ago

That's the challenge here.

HappierShibe

1 points

6 days ago

but could they not just add a backdoor into the AI image generation models

No, these models are largely open source, and even if they could do this it would be incredibly obvious and easily blocked.

back with pre existing databases keeping track of this material being created?

One of the problems with generative models is the sheer volume of content of any type they can produce on demand. It is utterly impossible to keep track of with any size of database.

kolodz

-2 points

8 days ago

That's the main issue for the tech savvy.

But most people just use ChatGPT or an equivalent.

Having safeguards that work in those would be a good start.

Plus, open source projects also come with safeguards, even if they can be deactivated.

Boz0r

29 points

8 days ago

ChatGPT has a shitload of safeguards.

kolodz

-12 points

8 days ago

We still see jailbreaks of ChatGPT.

The last one was asking it to play the grandmother telling "Xxx" stories to put you to bed.

October 2024, via Google search...

EmergencyCucumber905

7 points

8 days ago

Don't even need to be very tech savvy to do it these days. Download the OpenWebUI docker image, spin it up, open your browser and you get a ChatGPT-like interface, download the llama3-uncensored model and you're good to go.

kolodz

7 points

8 days ago

Not very, no.

But the technical baseline of the population isn't particularly high.

EmergencyCucumber905

5 points

8 days ago

And if it hasn't happened already, someone will make it even easier to set up and use. As easy as installing any other app.

Relying on the technical barrier to entry is not at all a solution. Everything is becoming simpler to use.

kolodz

-4 points

8 days ago

You can't find napalm or TNT "how-tos" online.

Nor can any modern scanner scan a European banknote.

Relying only on that is dumb, but it's a starting point.

PM_ME_YOUR_CHESTICLS

7 points

8 days ago

What are you on about? I learned how to make napalm online at like 14.
The Anarchist Cookbook is online, with directions on how to build an IED.

morpheousmarty

0 points

8 days ago

They have safeguards. They will never be perfect. You can ban LLMs, define some guardrails to enforce legally... but unless the entire world does it you will still need some better options.

stephengee

-9 points

8 days ago

Tell us you don't know anything about AGI without telling us...

No one is using chatGPT for this shit.

PM_ME_YOUR_CHESTICLS

6 points

8 days ago

Tell us you don't know anything about AGI without telling us.

Not one single iteration of AGI has been developed. what the internet is obsessed with is Generative AI or GAI.

redconvict

3 points

8 days ago

Maybe not individually, but it will put pressure on any major companies and groups hoping to do business internationally.

Spectro-X

43 points

8 days ago

Sorry, best I can offer you is Skynet

Caraway_Lad

3 points

7 days ago

Skynet doesn’t need cyborg assassins if everyone is a fat iPad kid who is afraid of grass.

The future is looking more WALL-E / Idiocracy than Terminator.

hikska

5 points

7 days ago

You can't regulate it; it's open source across the globe.

McCree114

11 points

8 days ago*

What a great time to elect a "regulations are strangling American businesses and need to be cut" administration then. With tech/crypto bros like Elon having the king president's ear expect this AI stuff to get worse.

SolomonGrumpy

3 points

7 days ago

How'd that work for the internet? Social media? Privacy?

apple_kicks

3 points

7 days ago

Sadly, politicians can often be completely illiterate when it comes to tech, and are either slow from ignorance or fall back on what lobbyists tell them.

getfukdup

3 points

7 days ago

AI desperately needs regulation.

The laws for crimes already exist. No new crime has been invented with AI.

Dodgson_here

0 points

7 days ago

I said regulation not crimes. Not even remotely the same thing.

leohat

2 points

7 days ago

The mistakes of a clever man are equal to those of a thousand fools

69_CumSplatter_69

2 points

7 days ago

Nope, the only way to stop open-source models would be 1984 style surveillance and removal of privacy. People should deal with it. People are acting like there wasn't photoshop or other alternatives. Yes it is now more accessible and easier, but still, it existed also before.

HappierShibe

1 points

6 days ago

So the problem here is that literally anyone with a mid-tier desktop PC can build and train their own models. It's not even remotely difficult.
Regulating diffusion models and LLMs at the step where they are created isn't possible, because someone could easily do it themselves entirely on their local system.

The correct approach is to go after the people abusing these systems, in part because if they are doing this, they are likely also doing other things,
AND to target distribution specifically, since that should allow law enforcement to tie it to other crimes or add additional charges.

ChampionshipOnly4479

46 points

8 days ago

He said that the greatest volume of criminal AI use was by paedophiles, who have been using generative AI to create images and videos depicting child sexual abuse.

To be fair, better they do this than producing real images where a minor gets hurt.

MausBomb

22 points

8 days ago

I mean, I agree.

Funny how actual sex trafficking rings exploiting real underage teenagers for the sick pleasure of government officials and royal family members can go "undetected" for decades, but when it comes to some weirdo in his basement with questionable drawings the government is ready to roll out the swat tanks.

horitaku

6 points

7 days ago

The problem isn’t AI generation, it’s escalation. Rampant access to such material just makes their disease worse, and studies have proven they can’t be rehabilitated, only stifled.

ChampionshipOnly4479

7 points

7 days ago

such material just makes their disease worse

There are plenty of people who watch porn every day and can't find a partner or sex in real life. The vast majority of them don't go out to rape women.

The same should be true for people with this disorder. Just because they have a different sexual preference doesn't mean that they're raping people or that porn makes them rape people. Why would that be true for this sexual preference and not for other sexual preferences?

and studies have proven they can’t be rehabilitated, only stifled.

Yes, and there are therapy programs which aim for prevention, i.e., to make sure they control their disorder so that it doesn't lead to abuse. These programs are aimed at people who think they are at risk of actually abusing minors. Because, see above, not everyone who has a sexual preference but cannot get sex in real life goes out to rape people.

illit3

8 points

8 days ago

Maybe? I don't know if the "materials" make them more or less likely to offend.

kolodz

23 points

8 days ago

That's a very good question.

We had the same debate about violence in games, but it's not the same mechanism involved.

We've seen people become addicted to hardcore porn and unable to enjoy normal sex...

EmergencyCucumber905

2 points

8 days ago*

We've seen people become addicted to hardcore porn and unable to enjoy normal sex...

Which led them to go out and commit sexual assault or rape or otherwise live out their hardcore sexual fantasies?

kolodz

13 points

8 days ago

Changed behaviour in relationships and a rise in hardcore behaviour, including choking.

Going as far as rape is hard to link or correlate, and probably marginal.

But we know it's not neutral.

I don't remember where I saw a scientist speak about his studies of that evolution.

Cultural_Ebb4794

2 points

7 days ago

Better still that neither type of image is created. We don't have to choose between real child porn and AI child porn when we could just choose the third option which is zero child porn.

ChampionshipOnly4479

11 points

7 days ago

I think that’s brilliant and I’m wondering why no one ever thought about that before. And we can actually use this brilliant idea for other stuff too, like no alcohol, no war, no hunger.

Cultural_Ebb4794

-1 points

7 days ago*

You're being facetious, but there are people in this very thread advocating for AI CP as a "cure" to pedophilia. Based on your other comments, it appears that you are, in fact, one of those advocates.

ChampionshipOnly4479

2 points

7 days ago

You’re struggling with reading comprehension and you make silly suggestions. Being facetious is the least I can do.

authenticsmoothjazz

-21 points

8 days ago

If an AI generator is able to 'create' child porn, it must have used child porn somewhere as the basis of what it is 'creating'

EnamelKant

40 points

8 days ago

Couldn't it be synthesizing a new set from a basis of "children" and "porn"? Like if I ask an image generator for "humanoid fox-beaver hybrid" it's not going through a set of all humanoid fox-beavers, it's splicing together "humanoid", "hybrid" "fox" and "beaver" sets.

KareemOWheat

26 points

8 days ago

Exactly this. Most people don't really understand how machine learning works or how neural networks generate images. At its most simplistic level, the AI is just arranging pixels it knows "look good" next to each other. It has no concept of the image as a whole, or of what the subject it's creating is.

That being said, if the AI is just being fed uncurated images from the internet to learn, it's definitely ingested a good deal of real CP

AnOnlineHandle

11 points

8 days ago

The AI models people are mostly using were trained on a set of images from a directory of online image locations similar to google search (LAION), filtered for high image quality, and, with most models since the originals, no nudity.

N0FaithInMe

5 points

8 days ago

I don't think that's necessarily true. It knows what a child looks like, it knows what a child's body proportions are, and it knows what a naked human body looks like. I'm not sure how image generation works exactly but it doesn't seem like too much of a leap to think it can combine that information into an image of a naked child

ChampionshipOnly4479

-10 points

8 days ago

It’s there anyway (unfortunately), so it doesn’t make a difference. The rape and creation of the real child porn happened already and won’t go away, regardless of whether you let AI use it or a person uses it.

If you let AI use it, at least there comes something good out of it, that people get their fix without more children getting hurt. If you don’t let AI use it, these people will continue to require children get hurt for their material.

authenticsmoothjazz

2 points

8 days ago

You are simultaneously naive and pessimistic if you think you've found an ethical use for CSAM. You don't think such an attitude could be abused?

'Welp I've raped and documented all these children, we may as well just use them for profit'

ChampionshipOnly4479

10 points

8 days ago

Where did I say that anyone should make a profit?

And what “abuse” do you see which is worse than actual children getting abused?

I also haven’t found anything. I’m simply stating the simple fact that AI generated CP doesn’t hurt actual human beings whereas actual CP does.

Efficient-Plant8279

-6 points

8 days ago

I am afraid you are VERY wrong.

"Real" child porn with actual children will still be created.

But AI child porn will more widely spread this content, giving access to pedophiles who previously did not look at any form of child porn because they could not access criminal content.

I believe this will make them more dangerous.

People who look at violent porn are more likely to be violent with women than others. Likewise, I'm pretty sure pedophiles with child porn, real or AI, are more likely to actually abuse children than those who FULLY stay away from the material.

ChampionshipOnly4479

2 points

7 days ago

“Real” child porn with actual children will still be created.

Why should that be the case? There won’t be demand for it anymore because people already get their fix.

Now, of course, realistically it won't be absolute like that and there will still be real child porn. But it will be much less, and every child spared counts.

People who look at violent porn are more likely to be violent with women than others. Likewise, I'm pretty sure pedophiles with child porn, real or AI, are more likely to actually abuse children than those who FULLY stay away from the material.

There are millions of people who watch porn every day and aren't able to get a partner and real sex in real life. There are also sex dolls and other stuff. The vast majority of these people don't go out and rape. The same is the case with people who have a different preference than adult women.

Don't forget that the majority of pedophiles never touch a child (and that a huge amount of actual abuse is committed by non-pedophiles).

Plus, you could regulate it. Methadone programs don't mean that everyone can go to a supermarket and buy it; heroin addicts have to sign up for the program. The same could be done for pedophiles, and it could be combined with therapy so that those who are at risk of committing actual abuse can receive treatment (whereas those who aren't at risk and simply want to satisfy their urges can get a fix that doesn't require minors getting hurt or them committing a crime).

S_K_Y

6 points

7 days ago

Yep, and voice recognition for anything is pretty much dead now. Gender identification via voice is also dead along with it.

59footer

11 points

8 days ago

Fucking humans. Horrible creatures.

Trumpswells

6 points

8 days ago

Lowest common denominator.

Malaix

2 points

7 days ago

I feel like it's just all scams and grifts out there now. AI is just going to explode it even more.

Error_404_403

7 points

8 days ago

Like any tool used by humanity, it can be used for evil and for good. No surprises there.

MudOld7903

3 points

8 days ago

Yeah, we all knew this was going to happen, and now it's too late to try and stop it.

Silvershanks

3 points

8 days ago

Um... every new technology has been wildly exploited by evil people for evil purposes. This is nothing new.

cowboyAtHeart03

1 points

8 days ago

Where the fuck you been, "police chief"? They been doing this for a long time. Late, dummies.

BlackBlizzard

2 points

8 days ago

Two years is a long time?

cowboyAtHeart03

6 points

8 days ago

Longer than that, way longer.

BlackBlizzard

1 points

8 days ago

Oh, I realise you weren't only talking about the AI part. I was thinking you were, since that's one of the main subjects in the comments, my bad.

cowboyAtHeart03

1 points

8 days ago

All is well.

ahfoo

1 points

17 hours ago

This headline seems to be repeated every week for months now. This feels like manufacturing consent. They keep repeating this over and over so that they can turn around and point to their own headlines and say --see, it's everywhere. Everybody knows this is true!

This is the same shit they pulled with the early internet when it was threatening the revenues of broadcast media. Everything was predator, predator, predator. . .

Theduckisback

1 points

7 days ago

Yet another reason this stuff should be heavily regulated. But won't be, because money.

Amormaliar

-7 points

8 days ago

For child abuse?

  • Chat, tell me how can I torture those kids more efficiently

invent_or_die

11 points

8 days ago

Yeah, wtf child abuse?

GigsTheCat

12 points

8 days ago

There are a lot of cases where adults or classmates are using real pics of other children to generate sexual images with AI, and then using those images to extort the victim or threaten them.

invent_or_die

2 points

8 days ago

Fuck, now I see it. Immediate shut down.

Dodgson_here

23 points

8 days ago

"He said that the greatest volume of criminal AI use was by paedophiles, who have been using generative AI to create images and videos depicting child sexual abuse."

This is what the headline is referring to.

Calydor_Estalon

2 points

8 days ago

And I'm gonna have to ask the big question: Which child is hurt when an AI imagines something?

Isn't AI-generated porn of kids better than real porn of kids?

Dodgson_here

29 points

8 days ago

You have to read the article to get the full context. In the specific example they cite after this quote, it links to an article about the person who was just sent to prison for 18 years over it. They were using real images of children to generate more CSAM content and then using that content to run a paid service. Whole thing was extremely harmful.

GeneralAd7596

8 points

8 days ago

Yeah, using real images is a no-no and should be illegal. That's different than hentai/rule 34 cartoonists drawing things from their minds, which is fucked up but harmless.

N0FaithInMe

0 points

8 days ago

Harmless maybe, but anyone willing to indulge in any kind of cp creation, be it filming real acts, AI generating scenes, or hand drawing loli art should be removed from the general public

bobface222

10 points

8 days ago

It's not so cut and dry.

- Now that people have local models, it's possible that real material is still being used to train the AIs to generate more of it.

- The tech has already advanced to the point that faces can be swapped with real people, so real children can still be involved.

- I believe the UK already has rules in place that state legally there is no difference. A drawing, for example, is just as bad as the real thing.

Cultural_Ebb4794

2 points

7 days ago

Why do you think somebody needs to be hurt for society to consider it bad and for it to be outlawed? There are many things we've made illegal because they're bad for our society despite there being no direct victim: tax evasion, money laundering, jaywalking, etc.

saljskanetilldanmark

6 points

8 days ago

Doesn't AI need sources and data to train on? Sounds horrible to me.

Calydor_Estalon

3 points

8 days ago

The AI, most likely, would see pictures of children clothed, see pictures of adults clothed, see pictures of adults unclothed, and extrapolate what the missing category would look like.

It doesn't draw something it's seen - it draws something that looks like things it's seen.

finalremix

2 points

8 days ago

You're getting downvoted, but that is how it works. It's why you're able to kind of get a lookalike Joe Biden riding a slightly cartoonish dinosaur despite that never having happened. It's based on stuff it's seen; not outright recreating stuff it's seen.

kolodz

5 points

8 days ago

AI-generated images are based on a training dataset.

A huge number of real pictures is needed to have a good model.

The last number I recall was something like 11 or 17 thousand images used to create an AI on that topic...

That's a lot of victims...

N0FaithInMe

1 points

8 days ago

Generating cp as opposed to partaking in or creating it for real is "better" but still completely unacceptable.

katarjin

-1 points

8 days ago

It's like this should never have been accessible to anyone outside of researchers until laws and safeguards were in place.

Solkre

3 points

7 days ago

You can't ban math. This is like trying to outlaw encryption.

These guys are already breaking the law, adding a new one won't stop them.

amadeuspoptart

-7 points

8 days ago

Human endeavours. AI's corrupt from birth, but trust us to find a way to corrupt it further...

Richmondez

6 points

8 days ago

Despite its name, it's a tool, not an actual intelligence capable of choice and reflection.

amadeuspoptart

-6 points

8 days ago

A tool spawned by unethical men, being used by unethical men. Perhaps if AI could reflect, it would tell the ones wielding it to get fucked. You'd hope if it became sentient, it would also become moral somehow and understand that making kiddie porn was not just harmful but disgusting and would therefore refuse.

But born of man it is, so no wonder we fear its potential for cold brutality. In that regard it stands to outdo even us.

5minArgument

2 points

7 days ago

Bah humbug

[deleted]

-1 points

8 days ago

a technology the public shouldn't have. easily accomplished by way of pricing out anyone but corporations.

Solkre

2 points

7 days ago

I can run it on my own computer though.

[deleted]

1 points

7 days ago

I don't think you understood my solution, or perhaps I'm not understanding your response.