technology

uk-govt-crest.png
>> No. 28931 Anonymous
7th January 2025
Tuesday 10:57 pm
28931 Online Safety Act
This comes into force on 16 March 2025, so in a little over two months. For reference, the legislation is here: https://www.legislation.gov.uk/ukpga/2023/50/contents

It affects all sites where users can interact, no matter their size or reach, so long as they have visitors located in the UK, despite the frequent use of phrasing like "social media companies" which might lead one to believe it's aimed only at the Facebooks and Googles of the world. Worse, WhatsApp, iMessage and Signal have all expressed concern that the act would be an end-run around truly e2e encrypted communication, thanks to the requirements set out in section 122.

Did I miss the discussion here, given this will affect britfa?
>> No. 28932 Anonymous
8th January 2025
Wednesday 1:38 am
28932 spacer
I could have sworn there was a thread, but I can't find it now. I even asked Google to search here for me.

But if there isn't a thread, then this must be the first time I'm making these points: such a far-reaching law will be unenforceable, so nothing will change for most of us. But having it as a law will mean they can prosecute any website they choose to, and that is undeniably worse than the status quo.
>> No. 28933 Anonymous
8th January 2025
Wednesday 3:58 am
28933 spacer
>>28932

Ofcom has about 900 staff. They are responsible for regulating TV and radio, phone and broadband services, wireless spectrum licensing and Royal Mail. They have now been tasked with enforcing very complex regulations across most of the internet. To the surprise of no-one, they have been allocated no money whatsoever for this incomprehensibly vast task.
>> No. 28934 Anonymous
8th January 2025
Wednesday 8:50 pm
28934 spacer
>>28932
You're right, found it. It's on shed:
>>/shed/16013
>> No. 28935 Anonymous
8th January 2025
Wednesday 11:06 pm
28935 spacer
>>28931
It's mental how much you can get away with if you just use "think of the children" as a justification to do whatever you're about to do.
>> No. 28987 Anonymous
2nd February 2025
Sunday 6:33 pm
28987 spacer
This could go in any one of four or five threads we have, but let's use this one.

https://www.bbc.co.uk/news/articles/c8d90qe4nylo

>Four new laws will tackle the threat of child sexual abuse images generated by artificial intelligence (AI), the government has announced.
>The Home Office says the UK will be the first country in the world to make it illegal to possess, create or distribute AI tools designed to create child sexual abuse material (CSAM), with a punishment of up to five years in prison.
>Possessing AI paedophile manuals - which teach people how to use AI for sexual abuse - will also be made illegal, and offenders will get up to three years in prison.

Now, I'm not a paedophile and I don't really use AI, so I'm not affected by this, but it's this next bit that has made me furiously angry:

>"What we're seeing is that AI is now putting the online child abuse on steroids," Home Secretary Yvette Cooper told the BBC's Sunday with Laura Kuenssberg.
>Cooper said AI was "industrialising the scale" of sexual abuse against children and said government measures "may have to go further."
What the fuck is this brazen fearmongering? When someone molests a child and takes a picture of it, both acts are crimes, but the child molestation is the evil thing. Taking a photo of an evil crime is, in my opinion, less evil than actually committing a crime. But here we have new official government policy: pictures that are not photographs, which depict acts that never happened, are now officially ONLINE CHILD ABUSE ON STEROIDS. Fuck off.

It's crazy how passionately these people Think Of The Children when it's about kneejerk idiocy, but they never once think of the children when it comes to the environment, or house prices, or jobs, or quality of life.
>> No. 28988 Anonymous
2nd February 2025
Sunday 6:56 pm
28988 spacer
>>28987

Typical daft Labour nanny state shite we knew we were in for. One lad here once compared them to a party half made up of primary school teachers and it's stuck with me ever since. When all you have is corner time, everything looks like a kid who hasn't brought the right PE kit.

Okay, but to be less facetious and more cynical: it's mostly optics. They pick a thing which isn't really a big problem (that part's crucial, because a real problem would mean actually having to do something about it, which they know they can't and won't) but which is very emotive, and paedo hysteria is perfect for that. Then they can be seen to be CLAMPING DOWN and COMING DOWN HARD and BRINGING DOWN THE HAMMER and other variations of things impacting from above, without really having to do much of anything at all.

And of course the cherry on top, you can't say nowt about it because you don't want to look like a paedo sympathiser, do you? What you standing up for them sickos for? You a carpet-bagger or summat? You should be castrated and hung up by your own bollocks you piece of inhuman filth.

Remember how Paedogeddon and stuff like the Monkey Dust sketches with the Paedofinder General were all under Labour, but under the Tories that all went away? Weird, that, I always thought.
>> No. 28989 Anonymous
2nd February 2025
Sunday 6:58 pm
28989 spacer
>>28987

>What the fuck is this brazen fearmongering?

Also, most online AI image generators are now designed to reject requests that could lead to problematic images being generated. Most of them won't even let you create plain adult nudity. Try it, they'll abort with something like "Problematic content detected". So I doubt they're more permissive when you type in something that you hope will generate AI child porn.
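For what it's worth, the refusal you see is usually the cheapest possible layer: the prompt gets screened before any model ever runs. Here's a toy sketch in Python of that idea, with an entirely made-up blocklist and refusal message (no real service publishes theirs):

```python
# Toy sketch (not any real service's code) of pre-generation prompt
# filtering: a request is refused outright if it matches a blocklist,
# before any image model is invoked.

BLOCKED_TERMS = {"nude", "nudity", "gore"}  # hypothetical blocklist

def screen_prompt(prompt: str) -> str:
    """Return 'ok' or a refusal, mimicking 'Problematic content detected'."""
    # Normalise each word: strip trailing punctuation, lowercase.
    words = {w.strip(".,!?").lower() for w in prompt.split()}
    if words & BLOCKED_TERMS:
        return "Problematic content detected"
    return "ok"

print(screen_prompt("a watercolour of a lighthouse"))
print(screen_prompt("photorealistic nude figure"))
```

Real deployments layer classifiers on top of (and after) this, but the blunt keyword stage is why even innocuous prompts sometimes get bounced.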

As with most legislation on these issues, you've got policymakers who haven't a fucking clue about the actual technology. More often than not, they're in it for the public renown they'll gain as somebody who does something, however ineffective, to combat a social ill, and their advisors are either nutjob zealots who want to censor the whole Internet (if they had their way, you'd be reported for downloading one image of an adult tit), or they're just as clueless as the policymaker.

There isn't much you can reasonably say against child pornography being illegal, and those who deal in it being prosecuted. But that's not what this is really about. Not by a mile.
>> No. 28990 Anonymous
2nd February 2025
Sunday 8:41 pm
28990 I did my best to be nice, despite finding you all very annoying.
>>28987
>pictures that are not photographs, which depict acts that never happened, are now officially ONLINE CHILD ABUSE ON STEROIDS. Fuck off.

Men get so emotional about this kind of thing (being told it's not okay to wank to children).

I fear you're in too much of a huff about this to give a shit, but you're dead wrong on this. Like any other image produced by generative AI, those that depict CSAM have been trained on real images of the acts being committed. If you don't see a wider problem with an ability to mass produce and disseminate images of CSAM, abstracted from their original sources as they may be, you're very much living up to every stereotype of the modern libertarian I know of. Just as a thought experiment of sorts, would you be okay with a bunch of shitty, fake, Beatles songs being farted out by generative AI tools? I imagine on some level you'd think it was a misuse of the music of one of history's greatest bands. So if we hold that line, I think we can go a little further when it comes to real-life CSAM being turned into "AI generated" CSAM.

Here's an article that does a better job than I can be arsed doing explaining the matter, from our good friends at 404 Media: https://web.archive.org/web/20240530142245/https://www.404media.co/ai-generated-child-sexual-abuse-material-is-not-a-victimless-crime/

I'll be honest I don't see what the issue is with what Cooper said. It's a touch overwrought, but it's not like she said she's going to unplug the nation's WiFi (not that I'd notice if she did, living out in the sticks as I do, but I digress). This seems like a rather rational area of the law to update, and I don't see there being any risk to those who aren't specifically generating, or promoting the generation of, CSAM.
>> No. 28991 Anonymous
2nd February 2025
Sunday 8:48 pm
28991 spacer

nowandthen.jpg
>>28990
>would you be okay with a bunch of shitty, fake, Beatles songs being farted out by generative AI tools?

Apparently Paul and Ringo are fully on board with it.
>> No. 28992 Anonymous
2nd February 2025
Sunday 9:00 pm
28992 spacer
>>28991
I don't know what you're talking about. Whatever it is, I've totally purged even my ability to see it. Your whole post is a blur, a smudge even. You have my sincerest apologies for your wasted efforts.
>> No. 28993 Anonymous
2nd February 2025
Sunday 9:23 pm
28993 spacer
>>28990
You did a very good job of being nice, so thank you for that. I still disagree (the Beatles are shiiiiiit, man), but I will concede that my position on this comes almost entirely from research showing that paedophiles with access to CP are much less likely to attack a child in real life, though the Wikipedia page suggests this outcome might not be as guaranteed as I thought:
https://en.wikipedia.org/wiki/Relationship_between_child_pornography_and_child_sexual_abuse

There is also the argument that different AI tools can be used to train people in ways to molest children in real life, which I obviously oppose enormously, and the idea that AI CP can be used to blackmail real-life children. If someone had their own personal offline AI for that purpose, which is perfectly doable, I'd be happy for them to get their bollocks torn off. And I suppose the law has done its best to ban those local AIs without banning all of them; it's just really not how I would approach it.

>I'll be honest I don't see what the issue is with what Cooper said.
Child abuse on steroids would result in more children being abused. Saying it's "child abuse on steroids" when people make more images without abusing any children feels to me like she thinks that pictures and stories are worse than the actual acts. It's as if the crime of thinking about it were worse than the crime of doing it. But I don't think there's anything you can think about where the thought in itself is worse than actually, physically sexually assaulting a child.
>> No. 28994 Anonymous
2nd February 2025
Sunday 10:44 pm
28994 spacer
>>28990

>Just as a thought experiment of sorts, would you be okay with a bunch of shitty, fake, Beatles songs being farted out by generative AI tools?

Bit fucking late for that. But I'll not go into that whole tirade about how people woke up to the absolute state of the music industry not just after the horse has bolted but after it had found a pack in the wild, sired several generations of offspring, died of old age, and rotted back into compost.

Anyway you are quite right that obviously, images of child abuse should be illegal, and AI images trained on real images of it should also be illegal. The substance of this is how exactly they plan to police it. I'm of a mind to compare it with drugs enforcement, where in this analogy the best they'll be able to do is make an example of a few wrong 'uns with a carpet-baggery version of Stable Diffusion on their computer, like the porn equivalent of catching a lad with a 20 bag of weed and the bigger dealers up the chain getting away untouched.

But you also have to ask other, bigger questions, like what happens if you have your own completely upstanding porn AI that you use for adult, consensual, legal images for personal use, but without your knowledge there was something illegal or carpet-baggery in the training data? Are you going to be liable? These are the sort of things I just don't trust our government to get right.
>> No. 28995 Anonymous
2nd February 2025
Sunday 11:48 pm
28995 spacer

image(1).jpg
>>28990

>Like any other image produced by generative AI, those that depict CSAM have been trained on real images of the acts being committed.

Not necessarily. It's certainly easier to get a generative model to produce things that are very similar to the training set, but a reasonably sophisticated model can interpolate and generate images based purely on conceptual understanding. I doubt that Grok has ever been trained on an image of a pistol made of cheese, but the model weights contain the concepts of "pistol" and "cheese" and so can trivially combine the two.

It's also extremely difficult to prove exactly what was in the training set of a model. Models don't retain a copy of their training set, which is why training a language model on literally every book in existence doesn't count as copyright infringement. I'm sure that a British judge and jury could be convinced that there's no smoke without fire if they're sufficiently outraged, mind.
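That difficulty can be illustrated with a toy numerical example. The standard trick for guessing training-set membership (membership inference) doesn't find a stored copy, because there isn't one; it just notices that an overfit model fits its own training points suspiciously well. A hypothetical sketch with a deliberately overfit polynomial, nothing to do with any real image model:

```python
import numpy as np

# Membership-inference-style toy: an overfit model has much lower error
# on the points it was trained on than on fresh points from the same
# distribution. The loss gap is the only "evidence" of membership; the
# fitted coefficients store no copy of the training data.

rng = np.random.default_rng(0)
train_x = rng.uniform(-1, 1, 8)
train_y = np.sin(3 * train_x) + rng.normal(0, 0.05, 8)

# A degree-7 polynomial through 8 points effectively memorises them.
coeffs = np.polyfit(train_x, train_y, deg=7)

def loss(x, y):
    """Mean squared error of the fitted polynomial on (x, y)."""
    return float(np.mean((np.polyval(coeffs, x) - y) ** 2))

fresh_x = rng.uniform(-1, 1, 8)
fresh_y = np.sin(3 * fresh_x) + rng.normal(0, 0.05, 8)

print("train loss:", loss(train_x, train_y))
print("fresh loss:", loss(fresh_x, fresh_y))
```

The train loss comes out near machine precision while the fresh loss doesn't, which is suggestive but circumstantial: exactly the "no smoke without fire" sort of evidence a court would be asked to weigh.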
>> No. 28996 Anonymous
3rd February 2025
Monday 12:02 am
28996 spacer
>>28991
>Now and then

Keep this up and you lot are going to summon him.
>> No. 28997 Anonymous
3rd February 2025
Monday 12:19 am
28997 spacer

no.jpg
I'm sorry, I just don't watch enough anime to understand the levels of carpet-baggery taking place in this thread.
>> No. 28998 Anonymous
3rd February 2025
Monday 12:40 am
28998 spacer
>>28995

>Not necessarily. It's certainly easier to get a generative model to produce things that are very similar to the training set, but a reasonably sophisticated model can interpolate and generate images based purely on conceptual understanding.

Right. You could just feed that model entirely inoffensive pictures of clothed children, but that model will probably have gained an idea of the nude human form from other pictures, and could extrapolate from that.

Not saying anybody should do that. Ever.
>> No. 29040 Anonymous
18th March 2025
Tuesday 11:42 am
29040 spacer
This comes in today does it not? Are we safe?
>> No. 29041 Anonymous
18th March 2025
Tuesday 11:50 am
29041 spacer
>>29040
I saw people on 4chan counting down to midnight on Sunday night. Certainly, nothing has happened since then. However, it might take time for things to be reported to Ofcom and almost certainly ignored.
>> No. 29043 Anonymous
18th March 2025
Tuesday 8:48 pm
29043 spacer

GmUTXrjaQAAqJfr.jpg

>> No. 29044 Anonymous
18th March 2025
Tuesday 9:43 pm
29044 spacer
>>29043

What illegal content exactly did they expect they'd have to police?

Now, correct me if I'm wrong because I don't know the full ins and outs of this legislation. But it sounds to me like all a "platform" (i.e. site, forum, etc.) needs to do to be more or less safe from this is to have mods who actively delete and remove carpet-baggery, gorey, or daft woggery content.

I don't foresee them actually following through on any of the age verification stuff because that's just plainly unworkable. I think most of these forum owners who are shitting it and shutting down either wanted an excuse to jack it in years ago, or they're just giant fucking fannies.
>> No. 29045 Anonymous
18th March 2025
Tuesday 11:23 pm
29045 spacer
>>29044
I got curious, so I looked up if LiveLeak was still around, since I consider that to be a "bad" website. Turns out it closed down in 2021, but I got a few other gore websites suggested to me and yes, British netizens can still watch a dead naked woman have her head cut off. Perhaps there was more to the video, but I'm not that kind of person so I didn't watch all of it.

If that website is still up, and Facebook and Twitter are still up, and YouTube is still up, then yes, I think the sites that are closing down are just needlessly scared of a law that's entirely worthless. I've never heard of The Hamster Forum, but I can easily believe that it's run by people who don't realise just how hard it is to actually get in trouble nowadays.
>> No. 29046 Anonymous
19th March 2025
Wednesday 12:15 am
29046 spacer
>>29041
I like the idea of some low-level civil servant at Ofcom having to evaluate .gs but not being too internet-savvy, so they just assume all those names on IQ are real usernames and Purps finds himself in chains for hosting Eskimophobia.

>>29043
Has anyone heard from Richard Gere lately?

>>29044
>But it wounds to me like all a "platform" (ie site, forum etc) needs to do to be more or less safe from this is to have mods who actively delete and remove carpet-baggery, gorey, or daft woggery content.

I think the challenge is that this requires effort when a lot of hobby websites are run by one bloke who might log in once a week to clear spam and talk about fox proofing the garden. A lot of them could pack it in, especially when social media giants can provide much of the same thing on their platform with the bonus that Zuckerberg will probably start calling Starmer a carpet-bagger soon.
>> No. 29047 Anonymous
19th March 2025
Wednesday 12:23 am
29047 spacer
>>29045

I was inspired to go hunting for that skier video some lad mentioned the other day and stumbled upon probably the same places you were finding. It's pretty odd that all that sort of stuff is even still up on the clearnet at all, really; I would have thought they'd pre-emptively have gone to the darknet ages ago.
>> No. 29048 Anonymous
19th March 2025
Wednesday 6:17 am
29048 spacer
>>29047
Nobody wants to use a paedobrowser.
>> No. 29049 Anonymous
19th March 2025
Wednesday 9:57 am
29049 spacer
>>29048

I prefer to think of it as a drugsbrowser but to each their own.
>> No. 29050 Anonymous
19th March 2025
Wednesday 10:09 am
29050 spacer
>>29049
Hasn't the world moved on from DNMs after all the exit scams?
>> No. 29051 Anonymous
19th March 2025
Wednesday 10:31 am
29051 spacer
>>29048
I wonder if they'd use that as an endorsement. "Nine out of 10 carpet-baggers prefer paedobrowser."
>> No. 29052 Anonymous
19th March 2025
Wednesday 10:48 am
29052 spacer
>>29050

Generally they just move to another DNM. There's a dead pool up on TorTimes I think where you can bet crypto on which one will exit scam next. It's just part of the life cycle now.
>> No. 29053 Anonymous
19th March 2025
Wednesday 8:10 pm
29053 spacer
>>29052
I know a lot of the big sellers started up their own Telegram bots.
