

I wonder how plug-in hybrids end up that bad in terms of gas-to-electric ratio; a proper source on that would be useful. Most usage should be home-work commuting, and that should be reachable on electric power alone.
Maybe the solution is not simply to block hybrids, but to address the reasons they aren't driven on electric power. That could mean putting more chargers at homes and workplaces, or something else entirely (I don't know what the analysis pointed to as the cause).

The problem is that it's impossible to carve out this one application. There don't need to be any actual nude pictures of children in the training set for the model to figure out that a naked child is basically just a naked adult but smaller. (Of course I'm simplifying a bit.)
Even the more drastic step of removing all nudity from the dataset has been tried, and what was found is that removing such a significant source of detailed pictures showing a lot of skin decreased the quality of any generated image involving human anatomy.
So the solution is not a simple 'remove this from the training data'. (Not to mention that existing models able to generate these kinds of pictures are impossible to globally disable, even if you could affect future ones.)
As for what could actually be done: apply and keep improving scanning for such pictures (not on people's phones, though [looking at you here, EU]). That's the big problem in this case: the images got shared on a very big social app, not some fringe privacy-protecting app. (On that fringe end there is little to do short of eliminating all privacy.)
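For concreteness, platform-side scanning of this kind is usually hash-based: uploads are perceptually hashed and compared against a list of hashes of known material, so only near-matches get flagged, never the image content itself. Here's a minimal sketch of the idea in Python; the dHash below is a toy stand-in for robust industry hashes like PhotoDNA, and `known_hashes` is a hypothetical hash list:

```python
# Minimal sketch of hash-based upload scanning. The dHash here is a toy
# perceptual hash; production systems use far more robust ones, and the
# known-bad hash list comes from vetted clearinghouses, not the platform.
from PIL import Image

def dhash(path: str, size: int = 8) -> int:
    """Difference hash: shrink, grayscale, compare adjacent pixels."""
    img = Image.open(path).convert("L").resize((size + 1, size))
    pixels = list(img.getdata())
    bits = 0
    for row in range(size):
        for col in range(size):
            left = pixels[row * (size + 1) + col]
            right = pixels[row * (size + 1) + col + 1]
            bits = (bits << 1) | (left > right)
    return bits

def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

def is_flagged(path: str, known_hashes: set[int], threshold: int = 10) -> bool:
    """Flag an upload if its hash is close to any hash on a known-bad list."""
    h = dhash(path)
    return any(hamming(h, k) <= threshold for k in known_hashes)
```

The point of the hash-list design is exactly the privacy trade-off above: it only detects *known* images, which is why it works at upload time on a big platform but can't catch freshly generated material.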
Regulating this at the image-generation level could also be rather effective. There aren't that many 13-year-olds savvy enough to set up a local model and generate there, so further checks at the services where the images are generated would also help to some degree. Local generation is getting easier to set up by the day, though, so while this should be implemented it won't do everything.
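To make those generation-side checks concrete: hosted services typically gate both the prompt (before spending compute) and the generated output (with a trained classifier). A rough sketch of the shape of such a gate; everything in it (`BLOCKED_TERMS`, the classifier stubs) is a placeholder, not any real service's API:

```python
# Hypothetical sketch of a generation-side safety gate: screen the prompt
# before generating and the output after. Real services use trained
# moderation models for both steps, not keyword lists.
from typing import Callable, Optional

BLOCKED_TERMS = {"example_blocked_term"}  # stand-in for a real prompt classifier

def prompt_allowed(prompt: str) -> bool:
    # Placeholder pre-check: refuse prompts containing blocked terms.
    return not (set(prompt.lower().split()) & BLOCKED_TERMS)

def image_allowed(image_bytes: bytes) -> bool:
    # Placeholder post-check: a real service would run an abuse classifier here.
    return True

def safe_generate(prompt: str, generate: Callable[[str], bytes]) -> Optional[bytes]:
    """Wrap a generator callable with pre- and post-generation checks."""
    if not prompt_allowed(prompt):
        return None  # refuse before spending any compute
    image = generate(prompt)
    return image if image_allowed(image) else None
```

This is also exactly the layer that local models bypass, which is why the gate helps against casual misuse but not determined users.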
In conclusion: it's very hard to eliminate this, but there are ways to make it harder.