Russian crackers throw GPU power at passwords

Russian password-cracking (sorry, "password recovery") company Elcomsoft hasn't really been in the news since 2003, when Adobe helped make "Free Dmitry" the new "Free Kevin" by having one of the company's programmers, Dmitry Sklyarov, arrested for cracking its eBook Reader software. But Elcomsoft remedied the lack of press attention this week with its announcement that it has pressed the GPU into the service of password cracking.

With NVIDIA and AMD/ATI working overtime to raise the GPU's profile as a math coprocessor for computationally intensive, data-parallel problems, it was inevitable that someone would make an announcement that they had succeeded in using the GPU to speed up the password-cracking process. Notice that I said "make an announcement," because I'm sure various government entities, domestic and foreign, have been working on this from the moment AMD made its "close-to-metal" (CTM) package available for download. The Elcomsoft guys didn't use CTM, though. They opted to go with NVIDIA's higher-level CUDA interface, a move that no doubt cut their development time significantly.

Elcomsoft's new password cracker mounts a brute-force attack on the NTLM hashes that Windows uses to store passwords. The company claims that its GPU-powered attack cuts the time needed to crack a Vista password from two months to a little over three days.
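The core of such an attack is simple to sketch. An NTLM hash is just an MD4 digest of the UTF-16LE-encoded password, so cracking reduces to hashing candidate strings until one matches; GPUs win by trying billions of candidates in parallel. Here is a minimal single-threaded Python illustration of the logic (note that hashlib's "md4" support depends on the underlying OpenSSL build and may be absent on newer systems):

```python
# Brute-force sketch of NTLM cracking: hash every candidate string until
# one matches the target. Real GPU crackers do exactly this, massively
# parallelized; this toy version only illustrates the loop.
import hashlib
from itertools import product

def ntlm_hash(password: str) -> str:
    # NTLM = MD4 over the UTF-16LE encoding of the password
    return hashlib.new("md4", password.encode("utf-16le")).hexdigest()

def brute_force(target_hash: str, alphabet: str, max_len: int):
    for length in range(1, max_len + 1):
        for candidate in product(alphabet, repeat=length):
            word = "".join(candidate)
            if ntlm_hash(word) == target_hash:
                return word
    return None

if __name__ == "__main__":
    target = ntlm_hash("cab")            # stand-in for a hash dumped from the SAM
    print(brute_force(target, "abc", 4))  # -> "cab"
```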

Elcomsoft says it has filed for a US patent on the approach, but it's not clear what exactly the company is attempting to patent. A search of the USPTO's patent database turned up nothing, but that could be because the application hasn't made it into the database yet.

Ultimately, using GPUs to crack passwords is kid's stuff. The world's best password cracker is probably the Storm Worm, assuming that its owners are using it for this. As many as ten million networked Windows boxes—now that's parallelism.

Climate change mega-post

This week there seems to be a lot of climate news around, some good, some bad, and some that is just ugly. Rather than putting up a plethora of posts and getting accused of being Ars Climactica, we thought we would combine them into a single mega post for your consumption.

The first paper, published in Science, looks at the prospects for narrowing the range of estimates for the future climate. The authors note that the climate is a system of many physical processes coupled together nonlinearly, which has led climate modelers to focus on physical mechanisms and the fundamentals of nonlinear dynamics to understand and improve their models. Notably, the specific inclusion of many physical mechanisms has not led to a significant decrease in the range of climate predictions. Most of the blame for this has fallen on the nature of nonlinear systems: to obtain a small increase in predictive ability, one needs a very large increase in the accuracy of the initial conditions. We are stuck because we can’t improve the accuracy of our ancestors’ weather stations, and other methods, such as ice cores, will only ever yield averages. But as our earlier coverage of the nature of climate modeling explains, this isn’t really the heart of the issue. Climate models use a range of initial conditions and measure the probability of certain climatic conditions occurring based on those modeling results.

Instead of focusing on the physics of the climate or the dynamical system, Roe and Baker look at the behavior of a simple linear equilibrium system with positive feedback. All the physics is replaced with a single gain parameter, which describes how an increase in average temperature leads to a further increase in temperature. Although this does not describe the physics, it does encompass what we measure, so the model is valid for their purposes. They then explore how uncertainty in the gain parameter changes the spread of predicted temperature increases. The positive feedback has the effect of amplifying the uncertainties (just like a nonlinear system), meaning that it is practically impossible to improve climate estimates. This uncertainty is not really derived from the initial conditions (the starting climatic conditions) but rather from the natural uncertainty in physical mechanisms, such as cloud cover, which are a major focus of current modeling efforts. Basically, the amplification of the uncertainties, and the timescales involved, mean that the smallest uncertainties blow out to give the large range of temperatures predicted by climate researchers.
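To make the argument concrete, here is a toy Monte Carlo version of that feedback model. It assumes the standard linear-feedback relation ΔT = ΔT0/(1 − f); the parameter values are illustrative assumptions, not figures taken from the paper. The point is simply that a modest, symmetric uncertainty in the gain f becomes a long-tailed spread in warming, because the 1/(1 − f) amplification blows up as f approaches 1:

```python
# Toy Roe & Baker-style feedback model: dT = DT0 / (1 - f).
# Parameter values below are assumptions for illustration only.
import random

DT0 = 1.2                      # assumed no-feedback warming (deg C)
F_MEAN, F_STD = 0.65, 0.13     # assumed mean/std dev of the feedback gain

samples = []
for _ in range(100_000):
    f = random.gauss(F_MEAN, F_STD)
    if 0 <= f < 1:             # discard unphysical gains
        samples.append(DT0 / (1 - f))

samples.sort()
n = len(samples)
print(f"median warming: {samples[n // 2]:.1f} C")
print(f"5th-95th percentile: {samples[n // 20]:.1f} to {samples[-n // 20]:.1f} C")
```

Even though f is sampled symmetrically, the output distribution is strongly skewed toward high warming, which is the paper's central observation.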

This news will not call off the search for the parts of the environment that influence our climate because, if we are to mitigate global warming, we must know which bits of the environment are the best to change. This obviously includes human behavior, but that covers a whole gamut from urban lifestyles through to farming practices. Part of this picture is soil erosion, which removes carbon from the soil and deposits it elsewhere. The question isn’t so much where but what happens to that carbon en route and once it arrives. It was thought that soil erosion might contribute carbon dioxide to the atmosphere by opening up new mechanisms for the decomposition of organic matter. Alternatively, it has been argued that soil erosion deposits organic carbon in places—like the bottom of the sea, for instance—where it is effectively stored. However, testing these hypotheses has been problematic.

Nevertheless, problematic is what a good scientist looks for, so, with fortitude and dedication to the cause, scientists from the EU and US have collaborated to measure the uptake and removal of carbon over ten sites. They report in Science this week that, like normal land, eroding land also acts as a carbon sink. They do note that in eroding landscapes the carbon is more likely to move laterally, but it is no more likely to enter the atmosphere as carbon dioxide than on healthy pastureland. Of course, the amount of carbon stored is slightly less, so these soils are perhaps not as efficient carbon sinks as normal soils. More research is needed to determine whether there are differences in the long-term destination of carbon between normal pasture and eroding soils—however, until that research is done, we can cross soil erosion off the list of things to worry about in terms of global warming.

On the bad news front, rapid industrialization in the developing world and the lack of action in the developed world are now measurably increasing the rate at which we deposit carbon dioxide into the atmosphere. This is the conclusion of a paper to be published in the Proceedings of the National Academy of Sciences. Essentially, the authors have looked at estimates of anthropogenic carbon dioxide emissions, compared them to the measured concentration in the atmosphere, and determined from the time series that the natural carbon sinks are either already saturated or nearing saturation. The conclusion is that the concentration of carbon dioxide in the atmosphere is likely to increase faster than predicted in most scenarios. This is especially true since most scenarios assume that we will take some action to keep the rate of increase in atmospheric carbon dioxide (as a percentage) below the rate of economic growth (also as a percentage). Not the best news.
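One simple way to frame such a comparison is the "airborne fraction": the share of each year's emissions that stays in the atmosphere rather than being soaked up by land and ocean sinks. A rising fraction suggests the sinks are falling behind. The sketch below shows the arithmetic only; the arrays are placeholder values, not real measurements:

```python
# Airborne-fraction sketch: compare annual emissions against the annual
# rise in atmospheric CO2. Values below are illustrative placeholders.
emissions_gtc = [6.4, 6.5, 6.7, 7.0, 7.3, 7.6]      # emissions, GtC/yr (illustrative)
atm_increase_ppm = [1.3, 1.5, 1.6, 1.9, 2.0, 2.2]   # CO2 rise, ppm/yr (illustrative)

PPM_TO_GTC = 2.13  # approx.: 1 ppm of atmospheric CO2 corresponds to ~2.13 GtC

for year, (e, dc) in enumerate(zip(emissions_gtc, atm_increase_ppm), start=2000):
    airborne = dc * PPM_TO_GTC / e
    print(f"{year}: airborne fraction = {airborne:.2f}")
```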

Electronic Arts to undergo empire-wide restructuring, layoffs

When you're on top, the only place to go is down. In the face of stiff competition, EA's profits have begun to drop. Destructoid is reporting that job cuts and branch restructuring have already begun taking place, with extensive changes being made to many different studios under EA's umbrella, including Mythic.

Word of these changes came from an internal EA e-mail. CEO John Riccitiello has begun taking steps to ensure that his company's current state of affairs doesn't continue. This follows a previous restructuring meant to rebalance staff across the many branches of the company. To quote the e-mail:

Given this, John Riccitiello, our CEO, has tasked the company to get its costs in line with revenues… Every studio, group and division of the company has been tasked to review its overall headcount and adjust its organization to meet the needs of the business moving forward.

The changes to Mythic appear to be only the first in what will be a long line of changes. Certain teams, such as the Ultima Online group, will be relocated. Competitive employment strategies will also be enforced to keep employees working hard if they want to keep their jobs: "attrition, performance management, stricter hiring guidelines, and layoffs" will purportedly keep workers in check.

Given the state of EA's multiplatform competitors, including Activision, which is set to release one of the assured hits of the winter in Call of Duty 4, and long-time rival Ubisoft, which is sitting on Assassin's Creed, the company will be pressed to start taking more risks like skate if it hopes to stay fresh in this increasingly competitive development scene.

Microsoft antes up $240 million for a piece of the Facebook action

All of the recent flirting between Facebook and Microsoft has turned into hot equity action, as the two companies have announced that Microsoft will make a $240 million investment in the social networking site. In addition, Microsoft will begin selling ads for Facebook outside of the US and will become the site's exclusive ad provider in the US.

Facebook's value is not in the software itself—which could be duplicated relatively easily by a small group of programmers—but in the vast social networks the site has gathered, networks that contain information about people's interests and desires that would be invaluable for any marketing company.

Launched in early 2004, Facebook was originally targeted to college students, limiting registrations to those with a .edu e-mail address. The company opened the registration doors to all comers in September 2006, and the move appears to have paid off: the site is drawing an average of 250,000 new registered users every day, according to Facebook. Facebook now has over 49 million active users, according to VP of operations Owen Van Natta.

Just a couple of weeks before removing the college-students-only registration limitation, Facebook and Microsoft inked an advertising pact that made Microsoft the exclusive banner ad provider. The companies extended that agreement through 2011 earlier this year.

Google had also been rumored to be courting Facebook, but Microsoft appeared determined to close the deal. Google already has an exclusive $900 million pact with MySpace to provide that site—and other Fox Interactive Media properties—with contextual ads and search services. Yahoo has also courted Facebook in the past, but the $750 million to $1 billion offers were apparently not enough to scratch Facebook's financial itch.

Microsoft's $240 million investment is part of a new round of financing for Facebook, one that places a $15 billion valuation on the company.

PodSleuth to bring better iPod support to Linux

Banshee developer Aaron Bockover announced the PodSleuth project earlier this week, which is designed to expose iPod metadata through the Linux Hardware Abstraction Layer (HAL). PodSleuth replaces the old libipoddevice and is designed to be more adaptable and future-proof.

PodSleuth metadata will be merged into the iPod’s HAL device representation as properties so that the information can easily be accessed by any application that can interact with HAL. PodSleuth uses information extracted directly from plists on the devices and only relies on the model table to ascertain “cosmetic” distinctions, so devices that aren’t registered in the model table will still be supported. PodSleuth will provide an icon metadata property through HAL for devices that are listed in the model table, enabling the proper icon for known iPod models to be displayed in Banshee and Nautilus.
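To give a feel for what "exposing metadata through HAL" means in practice, here is a rough sketch of how a client application might read such properties over HAL's standard D-Bus interface. The specific "portable_audio_player" property keys are assumptions for illustration; the actual keys PodSleuth publishes may differ:

```python
# Sketch: query HAL device properties over D-Bus (requires python-dbus
# and a running HAL daemon). Property key names are hypothetical.
import dbus

bus = dbus.SystemBus()
manager = dbus.Interface(
    bus.get_object("org.freedesktop.Hal", "/org/freedesktop/Hal/Manager"),
    "org.freedesktop.Hal.Manager",
)

# Ask HAL for devices advertising the portable_audio_player capability
for udi in manager.FindDeviceByCapability("portable_audio_player"):
    device = dbus.Interface(
        bus.get_object("org.freedesktop.Hal", udi),
        "org.freedesktop.Hal.Device",
    )
    # Hypothetical PodSleuth-style key; GetProperty fails if a key is absent,
    # so check first
    if device.PropertyExists("portable_audio_player.type"):
        print(udi, device.GetProperty("info.product"))
```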

PodSleuth is currently available from the GNOME version control system, but is still under heavy development. An initial release is expected to take place next week along with a new version of ipod-sharp and Banshee 0.13.2. These releases will bring support for the new iPods to Banshee.

In a blog entry, Bockover also addresses criticisms of his choice of C# as the programming language for PodSleuth. He points out that PodSleuth is a HAL service, not a library, which means that other programs don’t have to be written in C# to use its functionality. PodSleuth also uses only the ECMA-approved portions of Mono, which means that it doesn’t rely on any patent-encumbered code.

Apple’s attempts to lock iPod users into iTunes have been unsuccessful so far, and impressive open source software solutions continue to provide strong alternative music management options for current iPod owners. Despite the availability of iPod support on Linux, though, the need to constantly reverse engineer and hack around Apple’s lock-in mechanisms makes the iPod a poor choice for Linux users. There is no guarantee that Apple’s antihacker efforts will be so easily overcome in future firmware revisions, so Linux users should still consider buying alternative players that support open standards.

Seagate customers eligible for manufacturer refunds, free software

Back in 2005, a woman named Sara Cho sued Seagate, alleging that the company's use of decimal rather than binary measurements when reporting hard drive sizes constituted false advertising. If you're not familiar with the difference between how the hard drive industry measures a gigabyte and how everyone else measures one, it boils down to this: HDD manufacturers (including Seagate, Western Digital, Samsung, and Hitachi) define one gigabyte as one billion bytes, while in most other computing contexts a gigabyte is defined as 2^30 bytes, or about 1.074 billion bytes—a difference of 7.4 percent. The gap between the two measurements grows along with hard drive capacities; at the one-terabyte level, it increases to roughly 10 percent.
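The arithmetic is easy to check. A drive maker's gigabyte is 10^9 bytes, while operating systems traditionally report sizes in powers of two (2^30 bytes per "gigabyte," 2^40 per "terabyte"), so the shortfall widens with each prefix:

```python
# Decimal (drive-label) vs. binary (OS-reported) capacity, and the gap
# between them, measured relative to the decimal figure.
for unit, decimal, binary in [
    ("GB", 10**9, 2**30),
    ("TB", 10**12, 2**40),
]:
    gap = (binary - decimal) / decimal * 100
    print(f"1 {unit}: label = {decimal:,} bytes, "
          f"OS unit = {binary:,} bytes, gap = {gap:.1f}%")
    # -> 7.4% for gigabytes, 10.0% for terabytes
```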

According to details posted at the settlement website, Seagate has agreed to issue a refund equal to five percent of a drive's original purchase price, provided the hard drive was bought between March 22, 2001 and September 26, 2007. Alternatively, customers can request a free set of Seagate's backup and recovery products (valued at $40). Seagate has agreed to this settlement despite denying any liability (and all of Cho's claims). The settlement must still be approved by the presiding judge and no ruling regarding the merits of the case has been given.

In order to submit a claim, buyers must fill out either an online claim form (for the free software) or a mail-in claim form (for the five percent refund). Drive serial numbers, merchant identification, and the month, date, and year of purchase are required for either form, so if you've already tossed the drive or don't remember when or where you bought it, you're unfortunately out of luck.

As for the merits of Cho's case, I can see her point—but her failure to win any real concessions from Seagate regarding product labeling means that the problem will continue to occur. What might've seemed trivial at one megabyte becomes a notable loss at one terabyte, though I have to admit that I don't plan on taking to the streets over the issue. It's quite possible, however, that Cho's settlement (if approved) will open the door for similar actions against other major hard drive manufacturers.

Simple Turing machine shown capable of solving any computational problem

A proof made public today shows that Stephen Wolfram's 2,3 Turing machine number 596440 is a universal Turing machine, and it has netted a University of Birmingham undergraduate $25,000. In 1936, mathematician Alan Turing proposed a machine that was the original idealized computer. A small subset of these Turing machines, known as universal Turing machines, are capable of solving any computational problem. In May, mathematician Stephen Wolfram put forth a challenge to amateur and professional mathematicians alike: determine whether one of the Turing machines listed in his book, "A New Kind of Science," was indeed universal.

Turing machines are simple logic devices that can be made to simulate the logic of any standard computer that could be constructed. They consist of an infinite number of cells on a tape (the memory) and an active cell referred to as the "head." Each cell can be one of a set number of colors, and the head can be in one of a fixed number of states. A set of rules determines, for each combination of cell color and head state, what color should be written to the tape, what state the head should enter, and which direction it should move (left or right).
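As a concrete illustration of those mechanics, a machine of this kind fits in a few lines of Python. The rule table below is a made-up 2-state, 3-color example of mine, not Wolfram's machine 596440:

```python
# Minimal Turing machine simulator: the tape is a dict mapping cell
# positions to colors (blank cells default to color 0); the rule table
# maps (head state, cell color) to (color to write, move, next state).
from collections import defaultdict

RULES = {  # illustrative rules, not Wolfram's machine
    (0, 0): (1, +1, 1), (0, 1): (2, -1, 0), (0, 2): (1, -1, 0),
    (1, 0): (2, -1, 0), (1, 1): (2, +1, 1), (1, 2): (0, +1, 0),
}

def run(steps: int) -> dict:
    tape = defaultdict(int)          # infinite blank tape
    head, state = 0, 0
    for _ in range(steps):
        write, move, state = RULES[(state, tape[head])]
        tape[head] = write           # write the new color...
        head += move                 # ...then move the head left or right
    return dict(tape)

print(run(20))                       # nonblank portion of the tape after 20 steps
```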

2,3 Turing machine, number 596440 in Wolfram's numbering scheme

On the fifth anniversary of the publication of "A New Kind of Science," Wolfram issued a challenge, namely, "how simple can the rules for a universal Turing machine be?" In order to spur interest in this, he offered up a $25,000 prize for anyone who could prove, or disprove, that the 2-state, 3-color Turing machine illustrated at right is universal. Finding small universal Turing machines is not a major problem in modern computer science or mathematics. According to MIT computer scientist Scott Aaronson, "Most theoretical computer scientists don't particularly care about finding the smallest universal Turing machines. They see it as a recreational pursuit that interested people in the 60s and 70s but is now sort of 'retro.'" However, people on Wolfram's prize committee hoped this would spur new work.

While the award is no $1,000,000 Clay Millennium prize, it spurred interest in at least one person: 20-year-old Alex Smith, an electronic and computer engineering student at the University of Birmingham in the UK, who has solved the problem and will receive the award money.

Smith said he first heard about the problem in an Internet chat room, decided it was interesting, and attempted to tackle it. His proof was not direct; he demonstrated that the 2,3 Turing machine is computationally equivalent to a type of tag system, a construct already known to be universal. In addition to his proof—available for free (PDF)—he developed a "compiler" that can generate 2,3 Turing machine code capable of solving any computational problem. According to Smith, he has no big plans for the prize money. "I'm just going to put it in the bank," he said.

Small plans: NVIDIA and the future of smartphones

It’s the last two decades all over again

The past two decades of PC history have been about desktops, servers, and laptops, but the "personal computer" of the coming decade is a small, pocket- or purse-sized device with a brightly lit screen, wireless networking and I/O, a sizable chunk of storage, and plenty of CPU and GPU horsepower on board. In short, you might say that the iPhone is the Macintosh 128K of the post-PC era, the 2008 lineup of Intel-based mobile products is the IBM PC XT, and we're all about to relive the 80s and 90s (complete with a brand new RISC versus CISC faceoff) but on a much smaller scale and in a more compressed timeframe.

Over the past few weeks, I've told you a bit about Intel's plans for this coming wave of pocket-sized personal computers: Silverthorne/Poulsbo will bring high-powered x86 hardware down into the ultramobile PC (UMPC) form factor in 2008, followed by the even smaller 32nm Moorestown chip that will be Intel's first full-fledged x86 media SoC and which could possibly be the future brains of Apple's iPhone. But I haven't yet told you about Intel's competition.

NVIDIA, AMD/ATI, ARM, and other powerhouses in the PC and embedded spaces aren't sitting idly by while Intel takes direct aim at what will be one of the hottest new battlegrounds of the post-PC era: your pocket. In the coming days, I'll tell you what each of these companies is up to, starting with NVIDIA.

"It is ultimately a computer that ends up in your pocket"

I recently had a series of exchanges with NVIDIA, including a free-ranging chat with Mike Rayfield, the general manager of NVIDIA's mobile group, about the company's plans for handheld devices. Like the rest of the technology industry, NVIDIA has been closely watching the smartphone space in general and the iPhone launch in particular, and the company has learned a few things both from Apple and from its own experience with the GoForce line of media SoCs.

The first lesson of the emerging mobile market is this: desktop PCs are about applications and performance, but handheld devices are about functionality and features. And on the list of important handheld features, the ability to make a voice call has gone from the top to somewhere near the bottom in the post-iPhone era.

"Historically, the handset market has been all about making a phone call," said Rayfield. "When you see advertisements for every phone but the iPhone, it's all about showing the form factor of the phone, and what color it is, or what size it is. It's basically an industrial design advertisement, or an advertisement by the network saying that your calls won't get dropped."

"The iPhone was the first one where, when you see the ad, you're actually looking at the phone doing something. The last thing they show you on the advertisement is making a phone call. So we believe that's reflecting what's happening in the industry, that these handheld devices are ultimately becoming your most personal computer. It is ultimately a computer that ends up in your pocket."

Repair service dubs Apple most reliable, Lenovo takes second

In a recent satisfaction survey, Apple scored highly in the reliability, tech support, and repair categories. That was an opinion survey, though, and not necessarily a scientific measure. As a more quantitative measure of manufacturer reliability, a third-party repair company called RESCUECOM has released its yearly Computer Reliability Report, which also puts Apple in the top spot for reliability.

The report's methodology computes a reliability index for each manufacturer: the percentage of repair calls made to RESCUECOM for that manufacturer's products, compared against the manufacturer's average Q2 market share. Apple finished first with a score of 357, meaning that its share of repair calls was only about a third of its estimated 5 percent market share. Lenovo dropped to second place—its score was over 100 points lower than Apple's.
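RESCUECOM doesn't publish an exact formula, but one plausible reading of the index is market share divided by repair-call share, scaled by 100, so that higher scores mean fewer calls than market share alone would predict. Working backward from the article's figures:

```python
# Assumed reconstruction of the index; the actual formula is not published.
def reliability_index(market_share_pct: float, call_share_pct: float) -> float:
    return market_share_pct / call_share_pct * 100

# From the article: ~5% market share and a score of 357 imply a call share
# of about 1.4%, i.e. roughly a third of the market-share figure.
implied_call_share = 5.0 * 100 / 357
print(f"implied share of repair calls: {implied_call_share:.1f}%")   # ~1.4%
print(f"index: {reliability_index(5.0, implied_call_share):.0f}")    # -> 357
```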

Although this looks like a mark in the 'win' category for Apple, I'm not sure a report from a third-party repair service is the best indicator of reliability. My biggest issue with this report is the fact that calls to RESCUECOM may not be indicative of overall reliability—I would imagine that if anyone has an issue with a new or warrantied computer, that person would call AppleCare or the other manufacturer first, leaving RESCUECOM out of the picture. Even if something isn't under warranty, knowledgeable friends or local computer stores might get tapped for the repairs.

So, yes, the survey results could mean Apple has great overall reliability. But then again, the report only tells us about repair calls made to RESCUECOM, which could be a fairly small subset of the overall reliability picture. I think the numbers are still good for Apple, but we'd advise you to take the results with a few grains of salt.

Apple quietly disposes of Classic in Leopard

If, like me, you came to the Mac in the past five years or so, you may not be aware of the fact that “Mac OS” wasn’t always followed by the letter “X”—sorry, the number ten. But the Mac did have a long and illustrious life before it gained its current UNIX underpinnings. Part of this legacy has lived on—on PowerPC Macs, at least—in the Classic environment.

Classic is to Mac OS 9 what Parallels is to Windows in Coherence mode: it runs a separate operating system in the background and makes the applications running under that OS (Mac OS 9.2, in the case of Classic) seem just like native applications. As I said, I never used a pre-OS X Mac, so I never had any legacy applications that required Classic. However, a year ago, I went on a browser downloading spree and tried to find the oldest possible browsers that still worked. It was kind of fun to see Netscape 3.0 start and immediately crash and burn because of the multitude of JavaScript errors on the netscape.com homepage.

Classic has never been available on Intel Macs, but as of Leopard, PowerPC Macs will also have to do without Classic, according to this Apple support page, which I’ll quote for you in its entirety:

Classic applications do not work on Intel processor-based Macs or with Mac OS X 10.5.

Upgrade your Mac OS 9 applications to Mac OS X versions. Check with an application’s manufacturer for more information.

So, if you have a PowerPC Mac that still has some Classic applications on it and you intend to go out and get the new cat tomorrow, I suggest that tonight, you pour a tall glass of your beverage of choice. Then, sit down in front of your Mac, and start those trusty Mac OS 9 applications up one more time, and remember the good times. After that, start your overnight full backup and turn in early, because Leopard will be your middle-aged PowerPC Mac’s last chance to be a youthful kitten again: few people expect the next Mac OS X after 10.5 to run on the PowerPC.

Judge: Educational privacy law not sufficient to block RIAA’s subpoenas

In August, we reported on a University of Tennessee student targeted by the RIAA for file-sharing who had attempted to quash a subpoena by arguing that the Family Educational Rights and Privacy Act (FERPA) prevented the release of his name, addresses, and phone numbers. The identity of Doe 28 in Virgin v. Does 1-33 will soon be known to the RIAA, as a judge denied the student's motion to quash the subpoena.

Doe 28 had argued that the RIAA's request was "unreasonable on its face" and that it should be denied because he had not waived his right to privacy under FERPA. The privacy law bars the release of educational records without the consent of students or parents, but "directory information," such as the student's name, address, phone number, and e-mail address, can be released without permission.

Most of the information sought by the RIAA "falls within the category of Directory Information under FERPA, which according to the university’s policy, does not require defendant's consent to provide to a third party," wrote Magistrate Judge H. Bruce Guyton in his opinion.

Judge Guyton also ruled that the school must provide the RIAA with Doe 28's MAC address, which, as the Jammie Thomas trial demonstrated, will be used by the RIAA to tie specific computers to file-sharing activity. A computer's MAC address doesn't qualify as an "educational record" under FERPA, according to the judge, and is therefore not protected.

Since the RIAA began ratcheting up its battle against on-campus file sharing last spring, the industry has filed hundreds of lawsuits against college students, all via the John Doe route. In contrast to the thousands of other P2P lawsuits, the RIAA has run into some roadblocks in its attempts to finger college students for copyright infringement. Some judges have blocked the RIAA's ex parte discovery, with one ruling that the 1984 law cited by the RIAA as the authority for its ex parte subpoenas didn't give the record labels that authority after all.

Still, it's going to be difficult for college Does to hide behind FERPA. As Judge Guyton pointed out, the info sought by the RIAA is not protected by the law, so it's likely that other motions to quash based on FERPA will also fail.

Motorstorm upgrade adds rumble support

Motorstorm is still cranking along as one of the PlayStation 3's better racing games, and a new update available today in Europe will make things even better by adding rumble support. Of course, we still have to wait until next year to actually get our DualShock 3s, but with the Ratchet and Clank Future disc including the PS3 1.94 firmware update that adds rumble support, I wouldn't be surprised if we saw a healthy market for import controllers.

What else does the Motorstorm update add?

- Added vibration support for DUALSHOCK 3 controller with adjustable sensitivity settings.
- Grid order has been randomised for first race in any online lobby.
- Finishing positions in your last race now determine your starting grid position in the next race within the same online lobby.
- On-screen indicator has been added to show whether voice comms are issuing from TV or headset. As before, please press and hold L1 (R1 if using control scheme 'B') to toggle headset output through TV or headset.
- Sensitivity settings for SIXAXIS™ Motion Sensor control have been added.
- 'Gloating Index' has been added to the Stats card. The Gloating Index gives guidance as to a racer's online prowess. It takes into account the number of people you’ve beaten in each race and the number of people that have beaten you. Use the Gloating Index to spot the players to beat in each race, and try to rank your Gloating Index up to the perfect '10'!
- Several causes of an occasional snagging issue that would destroy vehicles on suspension impact have been addressed.
- Further fixes to prevent the occasional statistics reset issue have been applied.
- Fixed an occasional issue with inaccuracies in Eliminator finishing results.
- Fixed an issue where winners leaving Eliminator before the race completed could cause issues for other players in the lobby.

Quite an extensive update, and I'm glad to see the game continue to have strong support. With games getting updated with rumble support, and new games being released that already include rumble, the wait for the DualShock 3 just got a little longer.

I'm not seeing the update yet in America, but I'm hoping we get it very soon.

Half of all Americans support government regulation of Internet video

24 percent of Americans say that the Internet can function, for a short time, as a replacement for a significant other. 20 percent are open to "chipping" their own children in order to track their location. 11 percent would be willing to surgically implant a device inside their brains that would allow direct mental access to the Internet. Does that make us, collectively, a crazy-ass country? Or does it say something profound about the current state of polling?

The results come from a new poll on Internet attitudes conducted by 463 Communications and Zogby International, which also revealed that young people of both genders find Scarlett Johansson sexier than an iPhone. (This is, in fact, the correct answer.)

The poll did reveal a more interesting result as well, namely that three-quarters of those over 70 support government regulation or a rating system for online video. That number decreases sharply as the age range decreases, but more than half of all Americans thought that the government should be involved in at least some form of Internet video monitoring.

This is the sort of finding that feels like a stake through the heart to libertarians. Writing at the Tech Liberation Front, Adam Thierer of the Progress & Freedom Foundation argued that policymakers will eventually jump on the lightly regulated Internet and start gorging their regulatory appetites, saying to themselves, "We must grow regulation! We must expand the tentacles of the regulatory state to include all those new technologies of freedom! We cannot let people think and act for themselves!"

In case it wasn't clear from the quote, Thierer considers this a Bad Thing.

Certainly, there's reason to think that such a scenario will eventually play out in Washington. The FCC is currently trying to decide whether it can come up with a working definition of violent content in order to regulate it on both broadcast TV and cable. Sexual content and language are already regulated by the agency, and as the Internet becomes an increasingly capable substitute for (or complement to) television, calls for ratings and regulation will no doubt arise.

Such calls have been resisted successfully by videogame makers, who have used a voluntary rating system to head off increased government controls. But Internet video is already coming under attack around the world, and it does raise some provocative questions for society. Should user-generated content sites, for instance, block the viewing of clips of vandalism or violence? UK school officials are already trying to address the issue after vandals began posting their work on YouTube.

And then there are privacy concerns. Few broadcast networks (at least in the US) would show video of a couple cavorting in the ocean, but a clip of a Brazilian model having sex on the beach with her boyfriend was widely available through YouTube. The case even prompted a Brazilian ban on the site for some time.

How these issues will play out in the US is anyone's guess, but at some point, talk of regulation and ratings will probably enter the picture. When that happens, it appears that a sizable number of Americans will support it.