UK Pushes For Internet Industry Self-Regulation

It's been said that you can find virtually anything you are looking for on the Internet. Sometimes, you stumble upon some things you wish you didn't find, or at least hope that your children never see. Such are the concerns of the U.K. Parliament's Culture, Media and Sport Select Committee, which just released its report on "Harmful content on the Internet and in video games."

To cut to the chase, the report "calls on Internet industry to sharpen up its act to protect users from harmful content." It doesn't necessarily recommend government regulation, but it doesn't rule it out either--especially if the committee feels that industry self-regulation continues to miss the mark following the report's recommendations.

The report's terms of reference frame the inquiry as covering:

"The potential risks to consumers, including children and young people, from exposure to harmful content on the Internet or in video games. The Committee is particularly interested in the potential risks posed by:
  • "Cyber bullying";
  • user generated content, including content that glorifies guns and gang violence;
  • the availability of personal information on social networking sites;
  • content that incites racial hatred, extremism or terrorism;
  • content that exhibits extreme pornography or violence;"

The report claims "99% of children accessed the Internet, most often at home and at school." As such, the committee feels a strong need to protect children from potential harm, even if the content in question has not actually been proven to be harmful:

"We agree that any approach to the protection of children from online dangers should be based on the probability of risk. We believe that incontrovertible evidence of harm is not necessarily required in order to justify a restriction of access to certain types of content in any medium."

The three areas of risk that the report identifies are: Content-based risks, such as from pornography and violence; contact-based risks, such as from sexual predators; and conduct-based risks, such as from cyber-bullying.

The specific self-regulation recommendations that the report makes to the industry for "controlling" these content-based risks are:
  • Designation of specific types of content as illegal
  • Removal of specific types of content for non-compliance with terms of use policies
  • Flagging specific types of content
  • Preview and review of user-generated content
  • Improving take-down times of inappropriate content
  • Implementing filtering software
  • Implementing age verification

The report further goes on to state:

"We strongly recommend that terms and conditions which guide consumers on the types of content which are acceptable on a site should be prominent. It should be made more difficult for users to avoid seeing and reading the conditions of use: as a consequence, it would become more difficult for users to claim ignorance of terms and conditions if they upload inappropriate content."

The report is perhaps most scathing, however, of those content providers that do not adequately pre-screen content:

"It is not standard practice for staff employed by social networking sites or video-sharing sites to preview content before it can be viewed by consumers. It was put to us that to pre-screen all material before it was published on the Internet would be impractical, because of the sheer volume of material being uploaded. In the case of YouTube, this is approximately 10 hours of video every minute. Instead, YouTube relies upon "millions of active users who are vocal when it comes to alerting us to content they find unacceptable or believe may breach our policies". Google (which owns YouTube) also told us that "we don't, and can't, review content before it goes live, any more than a telephone company would screen the content of calls or an ISP would edit e-mails". We are not convinced that this is a valid analogy: a person who makes a telephone call or who sends an e-mail does so in the expectation that the content will normally remain private. Content uploaded to many websites is generally intended for public notice and may be accessible by a person of any age in almost any part of the world."

For "controlling" contact-based risks, the report recommends:
  • Including "Report Abuse" buttons
  • Increased moderation of interactive sites

For "controlling" conduct-based risks, the report suggests limiting the amount of time children can access the Internet and play games, for example through technologies similar to the Xbox 360's Family Timer parental control feature.

While some of these practices are already in place, the committee feels that "leaving individual companies in the Internet services sector to regulate themselves in the protection of users from potential harm has resulted in a piecemeal approach which we find unsatisfactory." The committee wants to see a bigger push from the industry toward self-regulation, and adds further pressure with the thinly veiled threat of government regulation:

"We do not believe that statutory regulation should be the first resort. Instead, we propose a tighter form of self-regulation, applied across the industry and led by the industry. We therefore call on the industry to establish a self-regulatory body which would agree minimum standards based upon the recommendations of the UK Council for Child Internet Safety, monitor their effectiveness, publish performance statistics and adjudicate on complaints."

And...

"Although we have identified a role for the Government in bringing forward legislation to define certain types of content on the Internet as illegal, we do not believe that, in general, this is a field in which the Government should be very closely involved. Its chief role should be to ensure that bodies with a regulatory, advisory or law enforcement role in protection from harmful content on the Internet and in video games have priorities which are in line with Government policy and are resourced to carry out their duties effectively."

"Controlling" content is a debate that is not limited to the U.K. Similar debates continue to rage in the U.S., as some call for increased controls in the form of government regulation or industry self-regulation, while others argue that any form of content control amounts to censorship and is a potential violation of the First Amendment.

The U.S. has had very limited success with government regulation of online content through the partially overturned Communications Decency Act and the Children's Internet Protection Act. Meanwhile, organizations such as the Family Online Safety Institute try to promote best practices for industry self-regulation. Political pundits report that U.S. Democratic presidential candidate Senator Barack Obama is an advocate of government regulation of the Internet, while U.S. Republican presidential candidate Senator John McCain is a proponent of industry self-regulation.

It's a safe bet that this debate will continue not only in the U.K. and the U.S., but in other countries as well. In fact, not only is China blocking Internet access to certain "sensitive" sites leading up to and during the Beijing Olympics, but the International Olympic Committee (IOC) has accepted this censorship:

"IOC officials negotiated with the Chinese [so] that some sensitive sites would be blocked on the basis they were not considered Games-related," said Kevin Gosper, chairman of the IOC's press commission, according to press reports. "I regret that it now appears BOCOG [the Beijing Organizing Committee for the Games of the XXIX Olympiad] has announced that there will be limitations on Web site access during Games time," he added.

If some governments and organizations have their way, perhaps you won't be able to find virtually anything on the Internet. Is protecting our values and our children worth the price of potentially curtailing our freedoms? Is access to online content a right or a privilege?