This conversation usually occurs when parents are driving, cooking or otherwise engaged, and the idea of a “free” babysitter to occupy children’s attention for a much-needed ten or fifteen minutes can be too tempting to pass up. Besides, what harm can there be in the latest “Angry Birds” game, “Hunger Games” app or sharing pictures on Instagram? Quite a lot of harm, as it turns out, and the biggest danger is what you don’t know.
The app market has grown and changed so rapidly that parents find it difficult to keep up. They often don’t know how some familiar or innocent-sounding apps work, don’t understand the apps’ security and underestimate the harm a bad app can do.
Beyond that, they rely on app stores to keep inappropriate material away from kids, not realizing that these marketplaces, as well as federal regulators, are still finding their footing with these processes. Luckily, parents and kids can still learn and do a lot to make sure app usage is appropriately child-friendly.
That Sounds Innocent Enough
Parents think keeping their kids off Facebook will limit their ability to chat online, but apps built around hobbies often have a social media function that allows kids to connect online with anyone.
For example, Instagram — often promoted as a great app for young photographers for its array of picture filters and photo tools — also provides a platform to share and comment on the pictures. The app’s photo-sharing system lets kids post and comment on others’ images, driving millions to join Instagram for its social network fun. The app’s surge in popularity may help bolster Facebook, whose purchase of the smaller network was finalized this summer.
“The Hunger Games” movie proved a hit at the theaters, as did an app of the same name, which was wildly popular well into the summer. But the app’s chat capabilities ended up putting one girl at risk after she received chat invitations from three different strangers asking her inappropriate questions. The 12-year-old, who downloaded the app on her iPod Touch with her mother’s permission, wrote to Dear Abby about the situation, which continued to haunt her even after her mom helped her delete the account.
“I want your readers to know this can happen and there are chat room apps for iPods,” the girl wrote. “I get good grades in school, but these guys almost tricked me into doing something I didn’t want to do.”
Dress-up app “Top Girl,” free for iOS and Android, seems harmless enough, but the content is much more grown-up than playing with Mom’s makeup or heels. The goal of the app is to dress the character provocatively enough to win the best clothes and modeling gigs — players go to the bar, grab a hot guy and take him home.
Parents from Oregon to Ohio expressed outrage over the game, rated for kids aged 12 and up, for its sexist and adult content — if the character isn’t “hot enough,” the player gets “sent home.” Some parents take issue with a game feature that prods users to keep playing: it sends out not-so-nice notifications that other girls are surging ahead and the player needs to check in and play to stay near the top.
Reviewers of the game cite its “addictiveness” and give it high ratings, but parents concerned about the message, and the in-app purchases available, aren’t so thrilled. Parents who think “Top Girl” is a harmless app might find that the app’s emphasis on superficially enhancing your looks to reel in a guy conflicts with the values they want their daughters to embody.
Apps can also endanger children’s health, and their availability may be surprising to some. Earlier this month, Australian researchers reported finding more than 100 apps in the Apple and Android app stores that promoted smoking, some of which have been downloaded millions of times.
These apps illustrate how tobacco proponents are pushing into this appealing channel, which is loosely regulated, sold worldwide, increasingly popular and open to kids.
Reaching young audiences is difficult for tobacco companies. The 1998 Tobacco Master Settlement Agreement, which ended multiple states’ Medicaid lawsuits against the tobacco industry for recovery of their smoking-related health-care costs, banned tobacco companies from advertising, sponsorship, lobbying and other activities targeting youth. The settlement, however, doesn’t mention smartphone apps.
The researchers, led by Nasser Bin Dihm of Sydney Medical School, wrote that their finding “identifies a new trend of promoting tobacco products in a new medium with global reach, a huge consumer base of various age groups and less strict regulation policies.” The identified apps included images of particular brands, provided information on where to buy tobacco products, supplied cigarette brands’ packaging for use as wallpaper on users’ devices, and let users simulate smoking behavior.
These apps could also easily attract teens and children because of their high-quality graphics and their availability under the “game” and “entertainment” categories in the app stores, and could potentially increase teens’ risk of picking up the habit.
Not all the smoking-related apps come straight from tobacco companies, but since many of them are released by developers who work under nicknames rather than business names, the apps raise a few red flags.
Below are five examples of what the researchers called the most creative pro-smoking apps described in the study.
The “myAshtray” app, for example, lets users click on the screen to drop cigarette ash into a virtual ashtray, while the “Cigarettes” app provides specs, packaging photos and global availability for major cigarette brands. “Puff Puff Pass” is a cartoon game where players click to make virtual characters smoke and earn points by passing a cigarette, pipe or cigar quickly between characters in a designated order. The “Cigarette Battery Widget” uses a lit cigarette icon to display remaining battery power on your smartphone, and the “CRA App” lets users stream audio and video related to cigar regulation and provides information on how to get involved in the Cigar Rights of America advocacy group.
Aren’t These Regulated?
The Federal Trade Commission’s consumer protection laws aim to extend privacy protections to smartphone apps, but these measures often fail to keep up. The agency is joining forces with app distributors and developers to ensure tech companies follow the protections of the Children’s Online Privacy Protection Act, or COPPA.
COPPA requires companies to secure verifiable parental consent for users aged 12 and younger before collecting children’s personal information. This is usually done with an e-mail sent to the parent. Sites are also required to clearly disclose that they are child-oriented and that they collect personal information such as e-mail addresses, names, phone numbers and demographic data.
The FTC is in the process of updating COPPA, which was written in 1998 before a majority of U.S. youths owned smartphones and apps began tracking locations. Several proposals are under the agency’s consideration, including better definitions and making host sites, like Facebook, Twitter, Reddit and Digg, responsible when partner sites violate the law.
Silicon Valley is protesting these proposals, saying the requirements would be unworkable and stifle freedom of expression. Regulators and privacy advocates, however, say the Web’s most powerful firms should take more responsibility for the benefits they get from their online presence. The FTC is expected to issue its decision on new rules by the end of the year.
COPPA, and proposed changes to strengthen it, create a growing problem for app stores and ecosystems, and may lead them to steer clear of curating child-directed apps, according to Morgan Reed, executive director of the Association for Competitive Technology (ACT), an advocacy group that represents more than 3,000 small and mid-size app developers and IT firms.
“This puts application stores or platforms at risk of being liable under COPPA for receiving and managing verifiable parental consent for every single application that they have ‘reason to know’ might be directed at children,” Reed says.
For now, the combination of children often using their parents’ phones, the ease of entering false birthdates and the complicated nature of these apps makes compliance difficult.
What’s a Parent to Do?
Instagram and other social media sites are designed to accept account applications only from children aged 13 and older. The sign-up process is not authenticated, but parents who are aware of their child’s activity can disallow it. In general, checking in with your child to see what new apps or games they use can also help ferret out activities you don’t endorse.
Privacy settings are usually set to public by default, so you and your child will need to restrict these so information isn’t available for viewing by just anyone. Make sure you also disable the geolocation features, which allow pictures to be tagged according to location. The combination of geolocation and public settings can broadcast the images, comments and location of your child’s latest birthday party to anyone.
Parents can also block in-app purchases and app downloads. On both Apple and Android devices, settings can disallow all downloads or only those rated for ages older than the child’s. Also, tying your iTunes account to your child’s and using your e-mail for receipts can give you a heads-up on your child’s app downloading habits.
Talking with children and setting up guidelines — including how many hours kids can use their devices, what they can view, and when they can text message — can go a long way to starting the conversation. And in this fast-paced world, you need to keep the dialogue going. New products come to market and kids learn new tech tricks all the time, so staying in the loop can help ensure both you and your child use this incredible technology for the many good things it can bring.