As I wrote last week, earlier this month brought what a Fox News article called a “Grok AI scandal.” Per that piece and another in The New York Times, it involved “a recent flood of Grok-generated images that sexualize women and children,” including an “image depicting two young girls in sexualized attire,” and “new images” of “a young woman,” with “more than 6,000 followers on X,” wearing “lingerie or bikinis.” Responses were swift: the British prime minister said the images were “disgusting” and would “not be tolerated,” a Brazilian official threatened to ban the products, three United States senators requested removal of the X and Grok apps from stores, and a Grok spokesperson publicly apologized while the company greatly limited user access to even unrelated image generation.
The problem, though, will not go away. The technology is here to stay, and it is certain that no matter how many restrictions companies place on their chatbots and other products, many users will find ways to turn people in photos into electronic paper dolls. So how can we understand the situation, identify violations, and enforce against them sensibly?
The current situation, outside AI, is a mixed bag. On one side, sexual exploitation of people under 16 or 18, especially girls, who draw the most emotional response, has become over the past decade or two our most hated crime, provoking an intensity far exceeding that directed at illegal drug sales and distribution during the Just Say No era. Over the past 30 years or so, teenagers have been cloistered, almost never appearing in public alone, and have increasingly been referred to as children, which, given their reduced experience negotiating with adults, is more appropriate than it was in, say, the 1960s. On the other side, the line supposedly crossed by the scandal is hardly firm: bikinis and underwear, along with leotards, short skirts, and other scanty apparel items, are freely available, and pictures of them being worn by children and adults of all ages are unrestricted and easy to find. Sex appeal, and what we might call semi-sex appeal, which can approach or even cross commonly recognized lines with the blessing of the paid subjects, have been used to sell products and publicize actual and would-be celebrities for centuries. Female puberty also starts much earlier: according to two sources, its average onset moved from age 17 in 1900 to 10 in 2000. Meanwhile, what arouses people sexually remains a deeply personal matter, and virtually any picture of any person will arouse at least a few.
What should be illegal? Beyond the most obvious cases, such as displaying genitalia in certain states and poses, it is not at all clear. What is and is not pornographic has long been notoriously hard to define strictly. But we badly need to do just that. Here are some insights.
First, not all children are the same. Men and older boys are hard-wired to be sexually attracted to females capable of reproduction. Fittingly, psychiatry’s Diagnostic and Statistical Manual of Mental Disorders, long the field’s premier desk reference, has for at least 40 years required, for a diagnosis of pedophilia, that the objects of attraction be prepubescent. That does not justify any sexual exploitation, but it tells us that attraction to adolescents calls for self-discipline and self-restraint, not a judgment of deviance.
Second, as a starting point, people need to regard all shocking pictures of faces they recognize as bogus. The law will protect us when that is justified, but such images will not go away either. Pictures of unknown faces should not be attributed as genuine to anybody. Everyone, from victims to parents to cultural observers to judges and lawmakers, should share these presumptions.
What we need, then, is a three-by-three matrix, with “pornographic,” “strongly suggestive,” and “acceptable” on one dimension, and “apparently prepubescent,” “apparently adolescent,” and “apparently adult” on the other. For adults, only nonconsensual pictures could be considered problematic. “Strongly suggestive” should have high standards, and cases such as the one above, where an influencer was depicted in clothing worn by other influencers, should not qualify. We will need to define rigorously, as hard as that will be, the borders of these cells. For example, is lingerie more provocative than a bikini, when many males respond more to the latter? Such things will not be easy to classify, but the existence of chatbots has left us no alternative.
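To make the matrix concrete, here is a minimal sketch in Python. The row and column labels come straight from the above; which cells count as illegal, and the names ILLEGAL_CELLS and is_illegal, are purely illustrative assumptions, since the whole point is that the borders remain to be rigorously defined.

    # A hypothetical sketch of the proposed three-by-three matrix.
    # Which cells are treated as illegal is an assumption for illustration,
    # not settled law.
    from enum import Enum

    class Explicitness(Enum):
        PORNOGRAPHIC = "pornographic"
        STRONGLY_SUGGESTIVE = "strongly suggestive"
        ACCEPTABLE = "acceptable"

    class ApparentAge(Enum):
        PREPUBESCENT = "apparently prepubescent"
        ADOLESCENT = "apparently adolescent"
        ADULT = "apparently adult"

    # One possible fill-in of the nine cells; defining their borders,
    # as the text says, is the hard part.
    ILLEGAL_CELLS = {
        (Explicitness.PORNOGRAPHIC, ApparentAge.PREPUBESCENT),
        (Explicitness.PORNOGRAPHIC, ApparentAge.ADOLESCENT),
        (Explicitness.STRONGLY_SUGGESTIVE, ApparentAge.PREPUBESCENT),
    }

    def is_illegal(explicitness: Explicitness, age: ApparentAge,
                   consensual: bool = True) -> bool:
        # For apparent adults, only nonconsensual pictures are problematic.
        if age is ApparentAge.ADULT:
            return not consensual
        return (explicitness, age) in ILLEGAL_CELLS

Under this fill-in, for instance, a pornographic image of an apparent adolescent would be illegal while a strongly suggestive one would not; whether those are the right answers is exactly the debate we need to have.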
Once we have defined which cells contain only illegal material, we will need to consider how the material in them can be identified. Purely AI-generated images with composite faces, using no photos of actual people, should generally, if appropriately and boldly tagged or watermarked, be legal not only for adults to generate for their own private use but also to share with consenting adult recipients. If that sounds extreme, consider this: wouldn’t it be a wonderful contribution for AI to drive actual child pornography, with its accompanying damage to actual children, to extinction? That could happen, as synthetic photos and videos could be custom-made to the consumer’s tastes and be of much higher quality, and the prospect of often devastating criminal penalties would steer virtually everyone away from genuine material.
It is not surprising that a problem we have been unable to solve might fall into place if we think new thoughts. If we stay legally muddled about images, many will suffer. If we do not, we can gain real freedom and more human happiness. Which would we prefer?