Videos and Decency

High Tech but Low Standards

Last week, the U.S. Congress passed, and President Joe Biden signed into law, the annual, nearly $2 trillion national budget, which includes a short, last-minute provision prohibiting the use of the social video app TikTok in federal government operations. For a few years, there have been complaints, mostly from Republicans and conservatives, that the widely successful Chinese-owned app was also a covert Chinese technology that threatened U.S. security.

The new law was criticized not only for being political, but also for ignoring a major issue on TikTok and on major American social-video platforms such as Facebook, Twitter and YouTube: indecency.

In fact, the American companies' social videos seem to have become more sexualized in order to compete with TikTok, which has been dominating not only worldwide audiences but also advertising, much of it from American companies.

Following in TikTok's footsteps, Facebook and Instagram have "Reels," YouTube has "Shorts," Snapchat has "Spotlight," and Twitter is said to be planning to join them.

The problem of sexualized content dates back to the creation of social video, and ranking algorithms often prioritize sexual content because it attracts larger audiences.

Following are excerpts from opinions by three experts on the subject, from their websites or as quoted in the media:

First, Shane Blackheart, author of books on fantasy and horror, described some of the sexual videos that showed up on her phone without her consent.

Second, Christian Sandvig, the director of the Center for Ethics, Society, and Computing at the University of Michigan, said that sexualized content is intentionally promoted to increase audiences and followers.

Third, Christine Pai, a spokeswoman for Instagram, explained the methods social-video companies use to detect and delete obscene videos.

Shane Blackheart: “Mostly Young Women”

“When I started using TikTok last year, it was supposed to be related to my work because I wanted to connect with fans and reach new readers on the social-video sites. Needless to say, social videos have become hugely influential in the book world …

“What did I find?

More than a few videos of women in outfits that barely covered any skin, like lingerie or see-through clothing. Or just half-nude women dancing or modeling, or something similar.

I didn’t linger too long to see much more because those videos made me uncomfortable …

“My research showed me how common that problem was, as content was intentionally designed to attract more viewers and followers, including suggestive clips of women or material meant to shock.

The unfortunate result was that people found themselves unable to get the sexual videos out of their automated feeds, despite never liking them or following those creators …

“Researching responses from those video companies, I found that they expected people to “train” their own feed to stop showing unwanted videos. But that takes time and isn’t as effective when pitted against human nature …”

Christian Sandvig: “Algorithms”

“Judging by the numbers of likes and comments on many of the sexualized videos, they are broadly popular.

Social-video companies say that algorithms can detect a viewer’s taste in any subject and work to accommodate that taste. And, in the case of sexualized videos, the companies said it was only a problem for a new user – until the algorithms adjust to his or her taste.

Those companies call this a “cold-start” problem, because they cannot make a behavioral prediction without any behavior …
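The “cold-start” fallback Sandvig describes can be sketched in a few lines. This is a hypothetical, much-simplified toy, not any platform’s actual system: with no interaction history to personalize on, the recommender defaults to whatever is most viewed site-wide, which is how broadly popular (including sensational) content can reach every new user.

```python
from collections import Counter

def recommend(user_likes, all_views, k=3):
    """Toy cold-start fallback: a brand-new user has no likes,
    so the feed defaults to globally popular items; once likes
    accumulate, the personal signal dominates."""
    popularity = Counter(all_views)  # site-wide view counts per topic
    if not user_likes:
        # Cold start: no behavior to predict from, so rank purely
        # by overall popularity.
        return [v for v, _ in popularity.most_common(k)]
    personal = Counter(user_likes)
    # Personal signal outweighs popularity; popularity breaks ties.
    return sorted(popularity, key=lambda v: (personal[v], popularity[v]),
                  reverse=True)[:k]
```

A new user with an empty history gets the site-wide top items, while a user who has liked a niche topic sees that topic ranked first, illustrating why the companies say the problem fades as behavior accumulates.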

“Research that I participated in showed that virtually any algorithm may deserve scrutiny. It is true that a search algorithm that did not satisfy its users would be unlikely to continue operation for very long. But it is important to note that most situations in which algorithms are employed permit the algorithms to satisfy multiple goals simultaneously.

There are also many ways an algorithm might be “rigged” that are normatively problematic. Therefore, public-interest scrutiny of algorithms is required, focusing on subtle patterns of problematic behavior that may not be discernible directly or from any particular instance.

It is important to consider that algorithms can be manipulated in ways that do not disadvantage their users directly or obviously …

“A group of female uploaders discovered an effective strategy to increase audience and advertising revenue. These “reply girls” produced videos whose thumbnails depicted their own cleavage, then marked these videos as responses to content that was topical and was receiving many views …”

Christine Pai: “Doing Our Best”

“Our ranked feed and recommendations for new users prioritize posts we think users are more likely to enjoy, based on what is trending or popular, or on their interactions.

We understand we may not always get it right, but we are working on improving suggestions and customization tools for new users …

“As part of our dedication to our audience, we have shut down the standalone Boomerang app so as to better focus our efforts on the main app. But Boomerang is still supported in-app in Stories, and we will continue working on new ways for people to be creative and have fun on Instagram …

“Boomerang from Instagram launched back in 2014 and allowed users to create mini videos from a burst of 10 shots.

With 301 million lifetime global downloads, Boomerang was a popular app, and people were still downloading it at the time of its removal.

But the removal applies only to the standalone app, not to the main app …”
