Russian crackers throw GPU power at passwords

Russian-based cracking "password recovery" company Elcomsoft hasn't really been in the news since 2003, when Adobe helped make "Free Dmitry" the new "Free Kevin" by having one of the company's programmers, Dmitry Sklyarov, arrested for cracking its eBook Reader software. But Elcomsoft has remedied the lack of press attention this week with its announcement that it has pressed the GPU into the service of password cracking.

With NVIDIA and AMD/ATI working overtime to raise the GPU's profile as a math coprocessor for computationally intensive, data-parallel computing problems, it was inevitable that someone would make an announcement that they had succeeded in using the GPU to speed up the password-cracking process. Notice that I said "make an announcement," because I'm sure various government entities domestic and foreign have been working on this from the moment AMD made its "close-to-metal" (CTM) package available for download. The Elcomsoft guys didn't use CTM, though. They opted to go with NVIDIA's higher-level CUDA interface, a move that no doubt cut their development time significantly.

Elcomsoft's new password cracker uses brute force to attack the NTLM hashes that Windows relies on. The company claims that its GPU-powered attack cuts the time it takes to crack a Vista password from two months to a little over three days.
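For a rough sense of why a GPU-sized speedup matters here, the arithmetic is simple: divide the size of the password keyspace by the rate at which hashes can be computed. The Python sketch below is purely illustrative (the character set, password length, and hash rates are assumptions of ours, not Elcomsoft's benchmarks), but the roughly 20x CPU-to-GPU ratio mirrors the two-months-to-three-days claim:

    # Back-of-the-envelope brute-force timing. All figures are illustrative
    # assumptions, not Elcomsoft's published numbers.
    charset = 62               # a-z, A-Z, 0-9
    length = 8                 # assumed password length
    keyspace = charset ** length

    rates = {
        "CPU": 10000000,       # assumed NTLM hashes per second on a CPU
        "GPU": 200000000,      # assumed hashes per second on a GPU (~20x faster)
    }

    for name, rate in rates.items():
        days = keyspace / float(rate) / 86400.0
        print("%s: %.1f days to exhaust the keyspace" % (name, days))

Only the ratio matters for the comparison; doubling both assumed hash rates halves both figures but leaves the 20x gap intact.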

Elcomsoft claims that they've filed for a US patent on this approach, but it's not clear what exactly they're attempting to patent. A search of the USPTO's patent database turned up nothing, but that could be because the patent hasn't made it into the database yet.

Ultimately, using GPUs to crack passwords is kid's stuff. The world's best password cracker is probably the Storm Worm, assuming that its owners are using it for this. As many as ten million networked Windows boxes—now that's parallelism.

Climate change mega-post

This week there seems to be a lot of climate news around, some good, some bad, and some that is just ugly. Rather than putting up a plethora of posts and getting accused of being Ars Climactica, we thought we would combine them into a single mega post for your consumption.

The first paper, published in Science, looks at the prospects for narrowing the range of estimates for the future climate. In doing so, the authors note that the climate is a system that consists of many physical processes coupled together nonlinearly. This has led climate modelers to focus on physical mechanisms and the fundamentals of nonlinear dynamics to understand and improve their models. Notably, the specific inclusion of many physical mechanisms has not led to a significant decrease in the range of climate predictions. Most of the blame for this has fallen on the nature of nonlinear systems. Essentially, to obtain a small increase in predictive ability, one needs a very large increase in the accuracy of the initial conditions. We are stuck because we can’t improve the accuracy of our ancestors’ weather stations, and other methods, such as ice core samples, will only ever yield averages. But as our earlier coverage on the nature of climate modeling explains, this isn’t really the heart of the issue. Climate models use a range of initial conditions and measure the probability of certain climatic conditions occurring based on those modeling results.

Instead of focusing on the physics of the climate or the dynamical system, Roe and Baker look at the behavior of a simple linear equilibrium system with positive feedback. All the physics is replaced with a single gain parameter, which describes how an increase in average temperature leads to a further increase in temperature. Although this does not describe the physics, it does encompass what we measure, so the model is valid for their purposes. They then explore how uncertainty in the gain parameter changes the spread of predicted temperature increases. The positive feedback has the effect of amplifying the uncertainties (just like a nonlinear system), meaning that it is practically impossible to improve climate estimates. This uncertainty is not really inherited from the initial conditions (e.g., the starting climatic conditions); rather, it comes from the natural uncertainty in the physical mechanisms themselves, which is a major focus of current modeling efforts and includes such things as cloud cover. Basically, the amplification of these uncertainties, together with the timescales involved, means that even the smallest uncertainties blow out to give the large range of temperatures predicted by climate researchers.
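The heart of the argument can be reproduced in a few lines. In the standard feedback relation, the equilibrium warming is the no-feedback response divided by one minus the gain, so a modest, symmetric uncertainty in the gain turns into a skewed spread of possible warming with a long, hot tail. The sketch below uses illustrative values of roughly the magnitude discussed in this kind of analysis; treat the specific numbers as assumptions, not figures from the paper:

    # Equilibrium warming with feedback: dT = dT0 / (1 - f), where dT0 is the
    # no-feedback response and f is the feedback gain. A symmetric uncertainty
    # in f produces a strongly skewed spread in dT.
    import random

    dT0 = 1.2                  # no-feedback warming for doubled CO2, degrees C (assumed)
    f_mean, f_sd = 0.65, 0.13  # assumed gain and its uncertainty

    samples = []
    for _ in range(100000):
        f = random.gauss(f_mean, f_sd)
        if f < 0.99:           # discard draws at or beyond the runaway point f = 1
            samples.append(dT0 / (1 - f))

    samples.sort()
    n = len(samples)
    print("median warming: %.1f C" % samples[n // 2])
    print("5th-95th percentile: %.1f to %.1f C" % (samples[n // 20], samples[19 * n // 20]))

Shrinking the uncertainty in the gain only narrows the output range slowly, which is the paper's point: the feedback itself, not poor knowledge of initial conditions, keeps the predicted range wide.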

This news will not call off the search for the parts of the environment that influence our climate because, if we are to mitigate global warming, we must know which bits of the environment are the best to change. This obviously includes human behavior, but that covers a whole gamut from urban lifestyles through to farming practices. Part of this picture is soil erosion, which removes carbon from the soil and deposits it elsewhere. The question isn’t so much where the carbon goes as what happens to it en route and once it arrives. It was thought that perhaps soil erosion contributed carbon dioxide to the atmosphere by opening up new mechanisms for the decomposition of organic matter. Alternatively, it has been argued that soil erosion deposits organic carbon in places—like the bottom of the sea, for instance—where it is effectively stored. However, testing these hypotheses has been problematic.

Nevertheless, problematic is what a good scientist looks for, so, with fortitude and dedication to the cause, scientists from the EU and US have collaborated to measure the uptake and removal of carbon at ten sites. They report in Science this week that, like normal land, eroding land also acts as a carbon sink. They do note that in eroding landscapes the carbon is more likely to move laterally, but it is no more likely to enter the atmosphere as carbon dioxide than it is on healthy pastureland. Of course, the amount of carbon stored is slightly less, so these soils are perhaps not quite as efficient as normal soils at acting as carbon sinks. More research is needed to determine whether there are differences in the long-term destination of carbon between normal pasture and eroding soils—however, until that research is done, we can cross soil erosion off the list of things to worry about in terms of global warming.

On the bad news front, rapid industrialization in the developing world and the lack of action in the developed world are now measurably increasing the rate at which we deposit carbon dioxide into the atmosphere. This is the conclusion of a paper to be published in the Proceedings of the National Academy of Sciences. Essentially, the authors have looked at estimates for anthropogenic carbon dioxide emissions, compared them to the measured concentration in the atmosphere, and determined from the time series that the natural carbon sinks are either already saturated or nearing saturation. The conclusion is that the concentration of carbon dioxide in the atmosphere is likely to increase faster than predicted in most scenarios. This is especially true since most scenarios assume that we will take some action to keep the rate of increase in atmospheric carbon dioxide (as a percentage) below the rate of economic growth (also as a percentage). Not the best news.

Electronic Arts to undergo empire-wide restructuring, layoffs

When you're on top, the only place to go is down. In the face of stiff competition, EA's profits have begun to drop. Destructoid is reporting that job cuts and branch restructuring have already begun taking place, with extensive changes being made to many different studios under EA's umbrella, including Mythic.

Word of these changes came from an internal EA e-mail. CEO John Riccitiello has begun taking steps to ensure that his company's current state of affairs doesn't continue. This follows a previous restructuring meant to rebalance staff across the many branches of the company. To quote the e-mail:

Given this, John Riccitiello, our CEO, has tasked the company to get its costs in line with revenues… Every studio, group and division of the company has been tasked to review its overall headcount and adjust its organization to meet the needs of the business moving forward.

The changes to Mythic appear to be only the first in what will be a long line of changes. Certain teams, such as the Ultima Online group, will be relocated. Competitive employment strategies will also be enforced to keep employees working hard if they want to keep their jobs: "attrition, performance management, stricter hiring guidelines, and layoffs" will purportedly keep workers in check.

Given the state of EA's multiplatform competitors, including Activision, which is set to release one of the assured hits of the winter in Call of Duty 4, and long-time rival Ubisoft, which is sitting on Assassin's Creed, the company will be pressed to start taking more risks like skate if it hopes to stay fresh in this increasingly competitive development scene.

iTunes Store boosts indie offerings while Hollywood still holds out

It's no secret that the iTunes Store's movie selection is lagging behind the soaring success of the music department. While the store has steadily risen up the top music retailer charts (currently holding the number 3 spot behind Best Buy and Wal-Mart), the iTunes Store's best-selling movie list includes not much more than a few Disney/Pixar hits and classics like The Princess Bride and Zoolander.

In other words: the movie shelves aren't exactly stocked with fresh goods. Obviously the iTunes Store can't simply wait around for Hollywood to pull its head out of its ass, so the store is looking to expand its horizons into an industry that we reported back in February was ripe for digital distribution: independent film.

On November 20, the iTunes Store will debut Purple Violets, a $4 million film from indie director Edward Burns. It will be the first time a feature film debuts exclusively on the iTunes Store, according to the New York Times, and the latest step toward injecting life into the store's struggling movie section. For now, it seems as though only indies are steadily hopping on board and singing the praises of the iTunes Store.

While the "little guys" (as they're called by Apple's iTunes VP Eddy Cue) are thrilled to gain the visibility the iTunes Store provides, the store still doesn't seem to be making headway on the larger challenge it faces: getting the major studios to stop quibbling over outlandish DRM and prices for digital content. Hollywood got spooked by the sudden and rampant spread of music piracy, but now it seems to have a hard time learning from the music industry's turnaround success with legitimate digital distribution outlets.

Ultimately, it looks like Hollywood and the iTunes Store are likely to be stuck in a stalemate for some time. Hollywood wants higher prices (despite making even more money per digital download than with DVDs) and crappier restrictions. In Apple's favor, however, most consumers typically don't jump ship after they've invested time and money in products like the iPod and iTunes Store—the current DRM that still exists on much of the store's music and all of its video will help keep those consumers in their seats for some time. Our best hope for now is that a growing indie film segment in the iTunes Store will drag the movie dinosaurs out of their self-imposed mothballs and into the 21st digital distribution century. We aren't quite ready to start holding our breath just yet, though.

Samsung SDI posts loss on weakening plasma prices

Up until the past few years, plasma screen technology dominated the 40"+ television market. Not only did plasma screens offer a larger picture than LCDs could produce, they were typically brighter and sported better contrast ratios, faster response times, and wider viewing angles than their LCD counterparts. Recently, however, LCDs have narrowed the gap significantly in many of these areas, and companies like Samsung SDI that specialize in producing plasma screens are feeling the pinch as a result.

As ZDNet reports, Samsung SDI posted a net loss of $112.2 million for its July-September quarter, which actually beat the $125.4 million loss analysts predicted. Sales of plasma screens actually grew by 40 percent compared to the second quarter of this year, but the falling price of plasma screens has hurt Samsung SDI's efforts to re-enter the black. Unfortunately, it doesn't appear that things will get better any time in the near future, although SDI does expect plasma prices to stabilize in the fourth quarter. Plasma television sales have continued to grow, from 750,000 sold in 2005 to one million sold in 2006, but LCD television sales grew from 2.6 million to 5.2 million over the same time period.

Now that LCDs have moved into the large display segment, it's clear that plasma has some significant deficiencies compared to LCD. Although the average cost of a plasma television dropped dramatically this year, from $2,480 to $1,664, that's still nearly double the cost of an LCD television, which dropped from $989 to $932. Plasma screens are also tremendous power hogs compared to LCD screens, and can draw up to 400W depending on the size of the screen and what's being watched.

Samsung SDI and other plasma manufacturers are collectively working to boost both the power efficiency and the luminous efficiency of their displays, while simultaneously working toward bringing OLED screens to the mass market. Realistically, however, OLED technology is years away from mass deployment. Sony has stated it intends to introduce an OLED television next year with a screen somewhere between 11 and 27 inches—and a price tag of $800-$1000. Without being able to count on OLED technology as a near-term revenue source, Samsung SDI and other plasma manufacturers are going to have to focus their efforts on building a better class of plasma screen. If not, they risk being eliminated from the consumer market.

Effects of chewing gum and fast food on diet

This week, the annual meeting of the North American Association for the Study of Obesity is being held in New Orleans, Louisiana. A pair of studies presented on Monday examined a contributing factor in obesity and a simple way to help control appetite. The first study reported results from a series of national surveys carried out over the past three years that examined Americans' eating-out habits. The second looked at using chewing gum as an afternoon appetite suppressant.

Researchers at Temple University carried out an analysis of people's views toward healthy foods in restaurants. The study found that "Americans are less willing to pay more for healthy dishes, less knowledgeable about healthy menu items, and more likely to consider healthy items bland tasting." According to the study's lead author, Kelley E. Borradaile, "the results underscore the importance of competitively pricing healthy foods."

Data for this study came from a series of three national phone interviews carried out in 2004, 2005, and 2006. Each survey reached at least 4,000 adults aged 18 to 98. In the 2006 survey, Americans reported eating out five times a week. Fast food was the most common choice for breakfast and lunch; for dinner, casual dining and fast food were the most common options. The survey found that those who ate fast food three to six times a week had a BMI that was "significantly greater" than those who ate fewer than one or two fast food meals a week. It was also found that an additional one, two, or three fast food meals correlated with an increased body mass of 0.63, 1.26, and 1.86 kg, respectively. As obesity becomes a bigger and bigger health concern in the US, the series of surveys found that "Americans were less likely to pay more for healthier foods, less knowledgeable about healthy menu items and more likely to consider healthy items bland-tasting in 2006 than in 2004," according to Borradaile. Even with the increased public discourse, people are less informed about the effects of food on their daily diet.

A separate study, carried out by researchers at Glasgow Caledonian University and the Wrigley Science Institute, found that chewing gum can be a good appetite suppressant. The study found that by chewing gum before an afternoon snack, one would consume 25 fewer snack calories. While that is not a high number, according to nutritionists, even a slight reduction in caloric intake can have significant effects in the long term. The study comprised 60 adults between the ages of 18 and 54. Each participant consumed a sweet and salty snack after either chewing sweet gum or not chewing gum at all. Hunger, appetite, and cravings were then monitored throughout the remainder of the day. Along with reducing caloric intake, participants reported an improved mood due to reduced anxiety and stress and increased contentment and relaxation.

TiVo Series 3 and HD get bonus features, multi-room viewing

Just weeks after announcing its partnership with Rhapsody to offer music services, TiVo today announced a host of new features, as well as an eSATA storage product, that will be available for Series 3 and TiVo HD boxes.

We're most excited about the arrival of TiVo's Multi-Room Viewing (MRV) which will allow users to view their stored TiVo shows across separate TiVo devices in different rooms. For example, if you've stored an episode of Golden Girls in your living room, you can also watch it on the TiVo in your bedroom as you doze off. For TiVo fanatics, this is the best way to record all your favorite shows, even when you might have 3 or even 4 shows on at once (recording 2 per DVR, for instance). MRV support will extend across Series 2, Series 3, and TiVo HD boxes, although recordings must be sent from a Series 3/HD box to the Series 2, and not vice versa, and the Series 2 cannot handle HD content.

The Series 3 and HD boxes also now support TiVoToGo, which will allow any downloaded content to be sent across your network to a laptop or desktop computer, where it can be either viewed using Desktop Plus, or burned to a DVD. If you're using Mac OS, instead of using TiVo's Desktop software, you'll be able to use Roxio Toast 8 or Popcorn 3.

In a joint venture with Western Digital, TiVo has released the first TiVo-certified external storage device, dubbed the "My DVR Expander." It's a 500GB eSATA drive that looks just like the MyBook line of external drives currently offered by WD, albeit with an orange drive light. The My DVR Expander will work with the TiVo HD and Series 3 DVRs and will be available from Best Buy or the TiVo store for $199.99. eSATA support is nice, but the added box, power requirements, and heat have us thinking that hardcore users are better off hacking their own storage expansions or turning to a Weaknees TiVo.

Finally, TiVo also announced a progressive download feature, which will allow users to begin watching downloaded content from Amazon UnBox as it's downloading instead of having to wait until the download completes to begin watching the media. Keep in mind that this feature is completely reliant on the network connection's ability to download at a quick pace: if the download is too slow, the software may limit playback to avoid any hiccups.

We have a pair of Weaknees TiVos in the lab for testing, so we'll be testing out a bunch of the new functionalities, including the Multi-Room Viewing support, in the near future.

Floating, Texas-sized garbage patch threatens Pacific marine sanctuary

A looming environmental threat the size of Texas should be hard to miss, but when that threat is floating in a rarely-visited section of the Pacific Ocean and composed of a diffuse mass of plastic, it's easy for it to avoid public attention. The recent establishment of a marine preserve north of the Hawaiian Islands has refocused attention on this floating refuse heap, which has picked up the moniker the Great Pacific Garbage Patch.

The technical name for this area is the North Pacific subtropical gyre. It is bounded on all sides by a clockwise flow of currents around the Pacific basin and tends to have a high-pressure system sitting over it for much of the year. The net result of these conditions is that material that drifts into this area tends to stay there, as this portion of ocean doesn't mix much with the surrounding currents.

Up until recent years, much of this material has been biodegradable; the arrival of plastics, however, has changed that. A survey (PDF) of the Great Pacific Garbage Patch that dates from 1999 suggested that, at the surface, plastic was present at five kilograms per square kilometer—that's nearly six times the plankton density in the same area. Most of the plastic was either thin films (such as trash and grocery bags) or monofilament line used in fishing.

Given that the Garbage Patch falls in a rarely-traveled area of ocean, the accumulation of plastic was unlikely to draw much public attention. But the region's obscurity has actually helped it gain widespread attention. The chain of islands extending northwest from Hawaii that forms the Garbage Patch's western border has been largely untouched since the islands were used as American bases during World War II. Last year, President Bush ordered that they remain undisturbed, creating the Papahānaumokuākea Marine National Monument (the web site contains an MP3 of its proper pronunciation).

The sanctuary designation means that the government is now obliged to come up with a management plan and perform regular environmental assessments, both of which are likely to focus attention on those contents of the Garbage Patch that either wash ashore on these islands or interfere with the animal life within the preserve. Even prior to the reserve's formation, Congress had passed a law that directed NOAA and the US Coast Guard to begin tracking marine debris and participate in global efforts directed towards its reduction. Unfortunately, the money necessary for these agencies to implement the law did not appear in the subsequent budget.

Plastics are an essential part of modern life, so this problem is not going away in the near future. One possible way of cutting down on the accumulation of plastic there would be to shift to plastics with a shorter half-life in the environment. But until these plastics hit the market, the clearest way to prevent the Garbage Patch from growing and harming the United States' largest marine sanctuary is to prevent the plastic from getting there in the first place, either by limiting its use or aggressively recycling it.

Verizon to pay $1 million over deceptive “unlimited” EVDO plans

The case of the limited "unlimited" EVDO has been settled: Verizon has agreed to pay out $1 million to customers that it has terminated for overuse of its high-speed data service. New York Attorney General Andrew Cuomo made the announcement today, saying that Verizon's decision came after a nine-month investigation into the company's services and marketing practices. The attorney general accused Verizon of producing misleading materials and deceptive marketing when it claimed that its data plans were unlimited.

The issue came to light last year, when some customers found their accounts on the chopping block after downloading too much data over Verizon's wireless broadband service. The wireless provider prominently advertised its EVDO service as "unlimited," but the fine print indicated that it was only unlimited for certain things, such as e-mail and web access. Video, music, and other media did not fall into that category, and Verizon began enforcing an undisclosed bandwidth cap on users that the company decided downloaded too much.

Verizon eventually cut off some 13,000 customers for excessive use of its "unlimited" service, leaving those customers out in the cold with equipment that they could no longer use. But the Attorney General said that the service's limitations were not clearly and conspicuously disclosed, and that they "directly contradicted the promise of 'unlimited' service."

Verizon, which has voluntarily cooperated with the investigation since it began in April, has now agreed to reimburse the terminated customers for the costs associated with their now-useless equipment. Verizon estimates that it will come out to be about $1 million, with an additional $150,000 in fines paid out to the state of New York.

"This settlement sends a message to companies large and small answering the growing consumer demand for wireless services. When consumers are promised an 'unlimited' service, they do not expect the promise to be broken by hidden limitations," said Cuomo in a statement. "Consumers must be treated fairly and honestly. Delivering a product is simply not enough—the promises must be delivered as well."

The company has also agreed to revise its marketing of the wireless plans, in addition to allowing common uses of the broadband connection (such as video downloads). As you can see above, Verizon has already updated some of its marketing material.

Let your geek flag fly: a review of Eye of Judgment

A cold winter morning, in the time before the light

Eye of Judgment
Publisher: Sony Computer Entertainment
Developer: SCE Studios Japan
Platform: PlayStation 3
Price: $69.99
Rating: Teen

A few days ago I had an experience of such intense geekiness that I felt like I was back in high school, in a garage somewhere, listening to a Led Zeppelin tape and playing Dungeons and Dragons. What caused this intense and oddly freeing embrace of the best parts of being a geek? I sat down with a good friend, set up Eye of Judgment, and played for hours upon hours. The cheap pot may have been traded in for a nice bourbon, but the overall feeling was the same.

Eye of Judgment is a game that doesn't just require you to be a geek, it demands that you give yourself over to the sheer insanity of the experience. You need a PlayStation 3, a nice television, the PlayStation Eye, a bunch of cards, a table near the PS3 so you can hook up the camera and lay down the mat, and, in an optimal situation, a friend sitting across from you.

When you're holding your deck of cards, staring into the eyes of your opponent before laying down that Biolith God that will lay waste to his entire force, you'll know that not only are you playing something that embraces many clichés of games you've played before while doing something completely new, you may have just reconfigured your entire room to do it.

What the players see on the left, what the television displays on the right

I lay my card down on the mat. The camera scans it, and on the television runes explode from the card's face before the deity emerges and destroys two of the four creatures my friend has laid down. He removes the "dead" cards from the mat, places them in his graveyard where the vanquished rot, and takes a sip of his drink. Only then do I realize I just blew all my mana, and he's been banking his for an equally impressive attack. He lays his card down, and our eyes turn to the screen to see what carnage his attacks will cause. Heavy metal music is playing in the background.

This is awesome. This is Eye of Judgment.

My Judgment. Let me show you it

Eye of Judgment is a little bit Magic: The Gathering and a little bit of the chess game from Star Wars. The new webcam from Sony looks down on the game mat, which features a 3×3 grid of squares. You lay down cards on the squares, summoning creatures to allow you possession of each slot. When you own five spots, you win. Sounds easy, doesn't it?

While you lay down cards on the mat, the actual play area may not look like much, but on your television you'll see a creature on top of each card, along with information on the hit points each creature has left, its attack strength, and which direction it's facing. Each time you turn a creature, attack, or summon a new monster, you use mana. You gain two mana points each round, and there are cards that allow you to gain more mana points by discarding creatures in your hand. Management of your mana points is a major piece of strategy: do you summon a bunch of weak creatures early in the game or save your points to lay down the serious ordnance?

There is a spell card that costs no mana to cast and simply spins a creature around. Since creatures can only attack in certain directions, and turning this particular creature back would cost six mana for every 90 degrees, spinning it with the free spell effectively wiped out my opponent's mana pool in one swipe. The strategy is deep; you'll be learning new tricks and getting ideas for how to play your cards for a long time.

Give your living room to Eye of Judgment

Learning how to play and what the numbers and information on the cards mean takes a little bit of a time investment, although the printed manual and video-based tutorials make the process easier. Just a warning, the video tutorials aren't interactive, and they're incredibly boring. They actually explain to you that when someone brings cards over, they are the "owner" of those cards. I'm surprised they didn't explain that the cards were made of "paper" which is made from "pulped trees."

The game itself is not overly complicated. You can explain things and play a quick game in about an hour, and after that, players catch on rather quickly to the subtleties of strategy. Eye of Judgment could have been a complicated mess, but I'm happy to report the rules make sense and are easy to comprehend while not limiting play. Each battle is a series of questions. Do you use mana now or later? Each square matches with an element, so if you play a water-based card on a water square you get a bump. Do you play a card now or wait for a better elemental slot to become available?

These are questions that will haunt you at night as you listen to Dragonforce and plan your next deck.

I've got a BA in mana management

How well the card game plays is a worthless thing to try to judge if the hardware doesn't work; if the camera has issues with the cards, then the whole setup is a waste. It can take a minute or two to calibrate the camera and to make sure the mat is lined up correctly, but the tools built into the game do a great job of making this painless. You'll also want to make sure the room you play in has plenty of light. After these two things are done, the camera does a fantastic job of seeing the cards and reacting quickly to your play.

This is made even more impressive by the fact that cards in play aren't the only things the camera has to keep track of. To attack or end your turn, you have to show the camera "action" cards to tell the game what you'd like to do, and this process is quick and easy. Sometimes the camera does lose track of the card, but when this happens you get a clear warning, and a slight adjustment is usually all that is necessary to make things right again.

We did bump into troubles once or twice with cards that the camera refused to recognize, but this was a very rare occurrence. Overall, the PlayStation Eye camera was more than up to the task of keeping play quick and fun.

My one complaint—and this will only become an issue in future games—is that even with good lighting the actual image the PlayStation Eye picks up is pretty muddy and indistinct. This won’t bother you in Eye of Judgment, where the screen is covered with overlays and you will only see your hand on the screen when you lay cards, but it doesn’t bode well for later games that will feature more video interaction. The picture is better than the last-generation Eye Toy, but the jump isn’t as large as I had been hoping for.

The camera stand breaks down quickly and easily for storage, and setup only takes a minute or so. You may want to flatten the mat when you first unpack it, as the little ridges from being packed are slightly annoying when laying down cards.

Comcast shooting itself in the foot with traffic shaping “explanations”

As the evidence that Comcast is doing something untoward with BitTorrent and other traffic on its network has mounted, the cable company has tried clumsily to fend off accusations of wrongdoing. The latest developments come in the wake of several conference calls held by the ISP in which it attempted to make a case for its practice of sending forged TCP reset packets to interfere with some P2P traffic.

Timothy B. Lee, who is a regular contributor to the Tech Liberation Front blog as well as Ars Technica, was invited to sit in on one of yesterday's conference calls, along with folks from a handful of think tanks. According to Tim, the Comcast engineer on the call said that the Lotus Notes problems were a known side effect of Comcast's traffic shaping practices, one the company was trying to fix. The engineer also "seemed to implicitly" concede that the accounts about the forged packet resets were accurate.

Delaying as a blocking tactic

The company still claims that it isn't blocking BitTorrent and other P2P traffic, just "delaying it." In a statement given to Ars earlier today, a Comcast spokesperson denied that the company blocks traffic. "Comcast does not block access to any Web sites or online applications, including peer-to-peer activity like BitTorrent," the spokesperson told Ars. "Our customers use the Internet for downloading and uploading files, watching movies and videos, streaming music, sharing digital photos, accessing numerous peer-to-peer sites, VOIP applications like Vonage, and thousands of other applications online. We have a responsibility to provide all of our customers with a good Internet experience and we use the latest technologies to manage our network so that they can continue to enjoy these applications."

Comcast VP of operations and technical support Mitch Bowling put it this way. "We use the latest technologies to manage our network so that our customers continue to enjoy these applications. We do this because we feel it's our responsibility to provide all of our customers with a good Internet experience."

Another Comcast executive told the New York Times that the company "occasionally" delays P2P traffic, "postponing" it in some cases. His rather clumsy analogy was that of getting a busy signal when making a phone call and eventually getting through after several attempts. "It will get there eventually," is the takeaway message.

That's a distinction without any meaning. If someone is preventing my calls from going through and giving me a busy signal, the effect is the same. At the time I am trying to make the call, it's being actively blocked; calling it "delayed" is merely an exercise in semantics. Comcast is, in effect, impersonating the busy signal and preventing the phone at the other end from ringing by issuing TCP reset packets to both ends of a connection.
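To make the mechanism concrete: a forged reset is just an ordinary TCP segment with the RST flag set and plausible-looking addresses and sequence numbers, which is what makes it so hard for an endpoint to tell apart from the real thing. As a hedged illustration (not a description of Comcast's gear), the following Python/scapy sketch simply logs any resets seen on a BitTorrent connection so they can be compared with what the peers actually sent; the port number is an assumption for the example:

    # Log TCP reset packets involving a BitTorrent port. Requires scapy and
    # root privileges. A reset that neither endpoint actually generated is the
    # signature of the kind of interference described above.
    from scapy.all import sniff, IP, TCP

    BT_PORT = 6881  # a common BitTorrent port, purely illustrative

    def log_reset(pkt):
        if pkt.haslayer(IP) and pkt.haslayer(TCP) and pkt[TCP].flags & 0x04:  # RST bit set
            print("RST %s:%d -> %s:%d seq=%d" % (
                pkt[IP].src, pkt[TCP].sport,
                pkt[IP].dst, pkt[TCP].dport,
                pkt[TCP].seq))

    sniff(filter="tcp port %d and tcp[tcpflags] & tcp-rst != 0" % BT_PORT,
          prn=log_reset, store=0)

Run something like this on both ends of a transfer and a reset that appears on the wire without either side having sent it points straight at a middlebox.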

What's particularly troublesome is that Comcast's FAQ leaves customers with the impression that all content will flow unfettered through its network. One entry states that Comcast engages in "no discrimination based on the type of content," saying that the ISP offers "unfettered access to all the content, services, and applications" on the Internet. Another FAQ entry informs customers that Comcast does not "block access to any Web site or applications, including BitTorrent."

What did I do wrong?

Comcast's attempts to clarify its traffic shaping practices are having the opposite effect of what the company intends. As is the case with its nebulous bandwidth caps, customers can find themselves running afoul of what appears to be an arbitrary limitation imposed by the ISP. As a result, Comcast's customers don't really know what they're paying for, aside from a fast connection that may or may not give them access to the web sites and applications they want. The company's public comments on the traffic shaping issue are intended to leave the impression that, like the bandwidth cap issue, this only affects a handful of bandwidth hogs. But judging by the comments we've seen from our readers and on other sites, either there are a lot more bandwidth hogs than Comcast would lead us to believe, or the company's traffic shaping practices extend further than is being disclosed. Without some transparency from the ISP, we're left to guess.

Comcast has a handful of options to choose from. The company can own up to what it's doing and tell customers how to avoid running afoul of its BitTorrent regulations. Comcast could also continue on its current course, keeping its opaque traffic management practices in place. The cable giant's best option may be dropping the practice of sending false TCP reset packets altogether.

There are a couple of reasons that the third option may be the best choice for Comcast. First, it may be against the law. An Indiana University PhD student and Cnet contributor believes that the illicit reset packets may violate state laws in Alabama, Connecticut, and New York against impersonating another person "with intent to obtain a benefit or to injure or defraud another" (language from the New York law). In sending out the spoofed packets, Comcast is impersonating the parties at either end of the connection.

When the market can't sort things out

Legal concerns aside, Comcast is providing net neutrality advocates with plenty of ammunition. Comcast is not running a neutral network right now, and its traffic shaping choices are degrading the broadband service of many a Comcast customer.

In a perfect free market, customers would be free to pack up and leave Comcast for greener and more open broadband pastures, but the competitive landscape in the US doesn't always provide that kind of choice. More than a few Comcast customers are faced with the choice of Comcast or dial-up, leaving them with the Hobson's choice of hoping their data packets can evade Comcast's traffic shaping police or not having broadband service at all.

KDE development platform appears


This Wednesday in KDE land sees the start of the Release Freeze for KDE 4.0, anticipating a final release in December. By then, KDE 4 will have been in development for two years and five months, counting from the aKademy conference in Malaga, Spain, in July 2005.

This freeze is significant for KDE development as evidence of an increasing professionalization in the KDE release process. Where past major releases like KDE 2 (2000) and KDE 3 (2002) involved a relatively close-knit group of insiders releasing a set of source tarballs for Linux distributions to package, the KDE landscape in 2007 is broader and much more diverse. Seventeen modules made up KDE 3.0, and that contained pretty much the entire desktop and all the software that most users would run on it.

During the lifetime of KDE 3, development became decentralized, and some of the most popular applications, such as Amarok, Kopete, and KDevelop, emerged. These formed their own developer communities, which may have little overlap with the core group of KDE library and platform hackers.

In addition, groups of businesses such as the Kolab Konsortium (we kid you not, those Ks are authentic German) have sprung up to serve the needs of companies using KDE. Their ‘Enterprise Branch’ flavor of the Kontact PIM suite, featuring additional enterprise features and QA work, is released on its own schedule, independently of the core KDE modules. Completing the scene, a swarm of smaller projects and individual developers take advantage of the KDE libraries, producing thousands of applications, as seen at www.kde-apps.org.

This expansion has caused major changes in the KDE release process. The era of the lone release manager (represented in the past by David Faure and Stephan Kulow) bearing responsibility for getting every module ready just in time for distributions’ release dates has ended, and instead a Release Team representing KDE’s various stakeholders oversees the readiness of all its components.

Also, the release itself has become staggered. To give all the projects and developers who rely on the KDE libraries time to adapt to the changes of a major release, these core libraries and runtime components are being frozen ahead of the rest of the desktop and its applications.

On October 30, the KDE Development Platform version 4.0.0 will be released. This forms a stable set of packages including the libraries, the PIM libraries, and runtime components of the base desktop such as the help browser, IO slaves, and the notification daemon that third party developers require to port their apps to KDE 4. They will be able to download this platform as binary packages for several distributions, easing the porting process.

KDE is expanding the scope of its ambition with this release, by targeting developers of scripting languages as well as its traditional C++ constituency. By shipping bindings for Python and Ruby as part of the Development Platform, the KDE project hopes to signal that these languages are fully accepted for KDE application development. Bindings for C# are due to follow shortly afterwards. And for the first time, KDE will be released for Mac OS X and Windows.
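To give a flavor of what that support means in practice, a KDE 4 "application" in Python should reduce to little more than the usual boilerplate. The snippet below is a rough sketch based on the expected PyKDE4 module layout rather than code from the 4.0.0 release, so treat the exact class and module names as assumptions:

    # A minimal KDE 4 window via the Python bindings (sketch, not a guaranteed API).
    import sys
    from PyKDE4.kdecore import ki18n, KAboutData, KCmdLineArgs
    from PyKDE4.kdeui import KApplication, KMainWindow

    about = KAboutData("hello", "", ki18n("Hello KDE"), "0.1")
    KCmdLineArgs.init(sys.argv, about)

    app = KApplication()
    window = KMainWindow()
    window.setWindowTitle("Hello from Python")
    window.show()
    app.exec_()

If the bindings deliver on that promise, scripting-language developers get the KDE libraries without touching C++ at all.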

Targeting two new operating systems poses a challenge to the community’s resources. A brief poll of developers showed that although ‘most things work’, the KDE Development Platform version won’t be ready for these platforms at the same time as the Linux flavor. It is hoped that in the long term, this investment will pay off by bringing more participants to Free Software.

The significance of this milestone to KDE fans and users is that the rest of the applications making up the core KDE Desktop are now soft-frozen. While third party developers are just getting to grips with KDE 4, the applications that form the central KDE experience need to get in shape for the KDE 4.0 release. With a lot of issues facing KDE hackers before 4.0 is a usable desktop, all work on new features and UI is stopped, and efforts focus on fixing the inevitable, long list of bugs. The state of the user desktop is the topic of an upcoming article.

Hands on with Google’s OCRopus open-source scanning software

The first official alpha version of Google's OCRopus scanning software for Linux was released yesterday. OCRopus is built on top of HP's venerable open-source Tesseract optical character recognition (OCR) engine and is distributed under the Apache License 2.0.

OCRopus uses Tesseract for character recognition but has its own layout analysis system that is optimized for accuracy. The OpenFST library is used for language modeling, but it still has some performance issues. OCRopus is designed to be modular, so that other character recognition and language modeling components can be used to eventually add support for non-Latin languages. An embedded Lua interpreter is used for scripting and configuration. The developers chose Lua rather than Python because Lua is slimmer and easier to embed. This release also includes some new image cleanup and de-skewing code.

We tested OCRopus with several kinds of content, including scanned documents and screenshots of text. The accuracy of the output varies considerably depending on the quality of the input data, but OCRopus was able to provide readable output in about half of our tests. Several of our test documents caused segmentation faults; this seems to occur when there is extra page noise or the letters aren't consistent.

We observed several common errors. For instance, the letter "e" is often interpreted as a "c" and the letter "o" is often interpreted as a "0" in scanned documents. OCRopus provides better results when scanning text that is printed with a sans serif font, and the size of the font also has a significant effect on accuracy.

The following examples show the typical output quality of OCRopus:

Tpo' much is takgn, much abjdegi qngi tlpugh we arg not pow Wat strength whipl} in old days Moved earth and heaven; that which we are, We are; QpeAequal_tgmper of hqoic hgarts, E/[ade Qeak by Eirpe ang fqte, lgut strong will To strive, to Seek, to hnd, and not to y{eld.

Tho' much is taken, much abides; and though We are not now that strength which in old days Moved earth and heaven; that which we are, we are; One equal temper of heroic hearts, Made weak by time and fate, but strong in will To strive, to seek, to find, and not to yield
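Errors like these are exactly what a dictionary-aware cleanup pass can catch. The sketch below is independent of OCRopus itself (its real language modeling runs through OpenFST, as noted above) and uses a hand-picked confusion table and a toy word list purely for illustration:

    # Naive OCR post-correction: for each word the dictionary doesn't know, try
    # common confusion substitutions (c/e, o/0, l/1) and keep the first variant
    # that is a known word. Illustrative only; not how OCRopus models language.
    from itertools import product

    CONFUSIONS = {"c": "ce", "e": "ec", "o": "o0", "0": "0o", "l": "l1", "1": "1l"}

    def variants(word):
        pools = [CONFUSIONS.get(ch, ch) for ch in word]
        for combo in product(*pools):
            yield "".join(combo)

    def correct(word, dictionary):
        if word in dictionary:
            return word
        for candidate in variants(word):
            if candidate in dictionary:
                return candidate
        return word  # give up and keep the raw OCR output

    words = {"much", "taken", "abides", "strength", "heroic", "hearts"}
    print(correct("hcarts", words))    # -> hearts
    print(correct("strcngth", words))  # -> strength

In spirit this is what running the output past aspell (which OCRopus already pulls in as a dependency) buys you, minus the combinatorial blowup a real implementation would need to avoid.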

The installation process is relatively straightforward for experienced Linux users, but there were a few bumps. I tested OCRopus on my Ubuntu 7.10 desktop computer. The only dependency that isn't in the Ubuntu repositories is Tesseract, which I had to download separately and compile. Since I already had many development packages installed, the only dependencies I needed (apart from Tesseract) were libtiff-dev and libaspell-dev. If you don't have the aspell dependency installed, OCRopus will still build, but it will trigger an error saying that the word list is unreadable.

I also ran into a snag with Tesseract, which apparently only comes with placeholders for the language data files, causing OCRopus to emit an error saying that the unicharset file is unreadable. In order to resolve that problem, I had to download the English language data files and decompress them into /usr/local/share/tessdata.

According to the release announcement, a beta release is planned for the first quarter of 2008. The focus for the beta will be on better performance and output quality, whereas the focus of this release was functionality.

Google's involvement in the project is motivated by the company's interest in digitizing printed documents. Open-source OCR technology could be valuable in many other contexts as well. Government agencies that want to digitize paper records, for instance, could one day benefit from OCRopus. Although OCRopus is weak in many areas, it has some real potential.