This post contains mature images that have been obscured, but may still offend some readers.
As Vine, Twitter’s new 6-second video-sharing service, is swept up and spun around by the inevitable pornado, it’s worth looking at how Instagram handled a similar issue, and where they’ve failed.
Social networks and porn don’t mix well. For a social network to click with the mainstream, people need to be sure of a few things: safety, privacy, minimal advertising, and freedom from objectionable images and content. Facebook, for example, has a strict no-nudity policy. If they become overzealous with their banhammer at times, people should remember they’re trying to strike a tricky balance between freedom of expression and a smut-free, unobjectionable environment.
Twitter has far fewer limitations than Facebook, and you can find porn there without much trouble. Thus, it should have surprised no one when Vine became a hub for 6-second clips featuring nudity and graphic sex. Vine is making some headway against it, but the simple fact is that they will never completely succeed.
Instagram, recently acquired by Facebook for $1 billion, fought similar problems for a long time. It’s hard not to see their recent, relative success reining in porn as an effort to keep Facebook happy.
For those unfamiliar with the service, Instagram allows people to quickly and easily share photos from mobile devices. Users can apply various hipster filters to make the photos look more stylish (usually by wrecking the contrast), add a caption, include a hashtag like #mycutechicken, and send them off into the aether. Other users can like the photos and make comments on them.
I set up a personal Instagram account in the early days of the service, but never really used it much, nor did I go looking for porn, even for research purposes. (More recently, I set up a public account, for those of you fascinated with pictures of chickens and coffee mugs.) I still don’t really like it all that much, since it repeats things done just fine by Twitter and Facebook, but I can see its appeal in a visually-oriented culture. Nude or sexually explicit images are banned by Instagram’s EULA, but for a long time hashtags like #sex, #porn, #nude, and myriad variations on the theme were common. If you searched for one of those tags, you got porn.
The service is very popular: not Facebook-popular, but big enough. Instagram reports 90 million monthly active users, 40 million photos per day, 8500 likes per second, and 1000 comments per second. They suffered a reversal of fortune when a change in policy claimed rights to photos shared on the service; but a public backlash and a sharp dip in user numbers caused them to back down, for now. Nonetheless, daily users of the service plummeted by 50%, from about 16 million per day down to 8 million. Those numbers will rebound, but exactly how much is an open question.
The thing that surprised me is how many parents feel comfortable letting kids use the service. As I said, it’s becoming safer, but it’s still not safe, and it’s not a place where kids should be hanging out. Instagram accounts are only allowed for people age 13 or up, but many, many far younger children are using it in violation of the policy.
There are no parental locks or protections for the flow of pictures on Instagram, which limits parents to an honor policy in which a child using Instagram 1) has a private, not a public, account; 2) only accepts “followers” who are known to the parent; and 3) never, ever searches for hashtags or browses around in the public photo stream. I’ve found no way to lock out searches, which means a kid can punch in a hashtag search and find himself in a photographic wild west without the parent ever knowing.
Instagram searches no longer return hits on obviously sexual words, but explicit images can pop up anywhere and display on a child’s screen before Instagram has a chance to delete them. This post is illustrated with screen caps I took in ten minutes of Instagram searching; they show the juxtaposition of kids and risqué pictures.
One thing parents probably don’t know is that explicit content can be given any tag. You can find a man revealing his wedding tackle in a picture tagged #teddybearpicnic. Instagram will probably find and delete the picture, eventually, but is that really something you want to risk? And for what? So kids can share pictures of each other making duck faces?
One of the high-traffic hashtags is #Kik, which is the name of a messaging service for mobile devices.
First off, if you’re a parent and you have kids using Kik, you’ve done something extremely silly, and need to stop it now. Kik is free of content restrictions, and is jammed full of pedophiles and pervs. If you want your kid propositioned for naked photos, by all means, let them have a Kik account.
Kik and Instagram have evolved a kind of symbiotic relationship, with people promoting their Kik handles on Instagram and vice versa. If you want to see some of the problems with Instagram, spend 10 minutes refreshing the #Kik search. In the middle of the day it was giving me about 10 new pics a second, and some porn crept past the censors before Instagram finally managed to delete it.
As you can see from the screen caps, there were still pics that didn’t violate the nudity/sex policy but which no parent wants a kid to see. Personal photos from obvious sexual stalkers and perverts sit right next to photos of sweet little girls, all in the same photo stream.
Do you want your child to see pictures of and from any of these millions of strangers every day? Would you let these strangers bring their pictures into your home?
Oh, and one more super huge problem: geotagging and location services. It’s very, very easy to accidentally tag a photo with a precise location, such as the home of the child who took it. Let that sink in for a minute: if your daughter shares a picture of herself without knowing that the geotagging is on, anyone looking at that stream could know where you live. There are apps and sites that can aggregate this info into a kind of stalker map.
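For the technically curious, here’s a minimal sketch (in Python, with made-up coordinate values) of why geotags are so dangerous: the GPS data a phone silently embeds in a photo’s metadata converts, with trivial arithmetic, into a latitude/longitude that any mapping site can plot.

```python
# Sketch: how the GPS metadata embedded in a photo becomes a map point.
# Real photos store location as degree/minute/second values in their
# EXIF block; the numbers below are hypothetical examples.

def dms_to_decimal(degrees, minutes, seconds, ref):
    """Convert degrees/minutes/seconds (as stored in photo metadata)
    to the decimal coordinates mapping sites use."""
    decimal = degrees + minutes / 60.0 + seconds / 3600.0
    # Southern latitudes and western longitudes are negative.
    return -decimal if ref in ("S", "W") else decimal

# Example values a phone might embed automatically:
lat = dms_to_decimal(40, 44, 54.36, "N")
lon = dms_to_decimal(73, 59, 8.5, "W")
print(round(lat, 4), round(lon, 4))  # → 40.7484 -73.9857
```

Three lines of arithmetic is all it takes to turn a shared photo into a street-level address, which is exactly what those stalker-map aggregators do at scale.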
So, what’s the verdict on Instagram? It’s gotten safer and more smut-free since the Instaporn flood of last fall, but it’s not out of the woods. It needs stricter controls and better parental locks, but even then, what you have will not be wholly safe, and the benefits for kids are little to none.
If you give in because “all the other kids are doing it,” then you’ve bought a grand old line of BS that’s been responsible for bad parental decisions for generations. Because, you know, everyone uses that line. At one point, no kids were using it, but little by little, this mob psychology takes over and effects a change in parental behavior. I’ve spent an entire career in the media observing the same phenomenon, particularly with games, and this is no different.
Kids need to stay away from the search features
If you still intend to let your kid use Instagram, there are some things you can do to minimize the risk:
- A child’s account must be private.
- Followers must be known to you and approved by you.
- Kids must not add friends without permission.
- They must not search for photos or browse hashtags.
- Location tagging must be turned off, without exception.
As for Vine, it may never be safe, because smut peddlers can embed a single frame of porn in a six-second clip, making it much easier to slide past the censors. Twitter is a long way from getting a handle on the problem, and if you want evidence, here’s what I found in my first 30 seconds of using the service. Note that the tags include #pets, #magic, and #howto, meaning the person who posted is looking to snare people who want non-pornographic content.