Russian crackers throw GPU power at passwords

Russian "password recovery" (read: password-cracking) company Elcomsoft hasn't really been in the news since 2001, when Adobe helped make "Free Dmitry" the new "Free Kevin" by having one of the company's programmers, Dmitry Sklyarov, arrested for cracking its eBook Reader software. But Elcomsoft has remedied the lack of press attention this week with its announcement that it has pressed the GPU into the service of password cracking.

With NVIDIA and AMD/ATI working overtime to raise the GPU's profile as a math coprocessor for computationally intensive, data-parallel computing problems, it was inevitable that someone would make an announcement that they had succeeded in using the GPU to speed up the password-cracking process. Notice that I said "make an announcement," because I'm sure various government entities, domestic and foreign, have been working on this from the moment AMD made its "close-to-metal" (CTM) package available for download. The Elcomsoft guys didn't use CTM, though. They opted to go with NVIDIA's higher-level CUDA interface, a move that no doubt cut their development time significantly.

Elcomsoft's new password cracker uses brute force against the NTLM hashes that Windows uses to store passwords. The company claims that its GPU-powered attack cuts the time it takes to crack a Vista password from two months to a little over three days.
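
For the curious, here's what a brute-force attack on an NT hash boils down to. This is a minimal, single-threaded sketch for illustration only (it is not Elcomsoft's code), and it assumes hashlib can provide MD4, which depends on the local OpenSSL build; a GPU cracker runs essentially the same hash-and-compare loop across thousands of candidates in parallel.

```python
# Minimal, single-threaded sketch of a brute-force attack on an NT (NTLM) hash,
# for illustration only -- this is not Elcomsoft's code. The NT hash is MD4 over
# the UTF-16LE encoding of the password; note that hashlib's MD4 support depends
# on the local OpenSSL build.
import hashlib
import itertools
import string

def nt_hash(password: str) -> str:
    return hashlib.new("md4", password.encode("utf-16-le")).hexdigest()

def brute_force(target_hash: str, charset=string.ascii_lowercase, max_len=5):
    # Try every candidate up to max_len characters. A GPU cracker runs this same
    # hash-and-compare step across thousands of candidates in parallel.
    for length in range(1, max_len + 1):
        for candidate in itertools.product(charset, repeat=length):
            guess = "".join(candidate)
            if nt_hash(guess) == target_hash:
                return guess
    return None

if __name__ == "__main__":
    print(brute_force(nt_hash("abc")))  # recovers "abc"
```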

Elcomsoft says it has filed for a US patent on this approach, but it's not clear what exactly it's attempting to patent. A search of the USPTO's patent database turned up nothing, but that could be because the application hasn't made it into the database yet.

Ultimately, using GPUs to crack passwords is kid's stuff. The world's best password cracker is probably the Storm Worm, assuming that its owners are using it for this. As many as ten million networked Windows boxes—now that's parallelism.

Climate change mega-post

This week there seems to be a lot of climate news around: some good, some bad, and some just plain ugly. Rather than putting up a plethora of posts and getting accused of being Ars Climactica, we thought we would combine them into a single mega post for your consumption.

The first paper, published in Science, looks at the prospects for narrowing the range of estimates for the future climate. In doing so, the authors note that the climate is a system of many physical processes coupled together nonlinearly. This has led climate modelers to focus on physical mechanisms and the fundamentals of nonlinear dynamics to understand and improve their models. Notably, the explicit inclusion of many physical mechanisms has not led to a significant decrease in the range of climate predictions. Most of the blame for this has fallen on the nature of nonlinear systems: essentially, to obtain a small increase in predictive ability, one needs a very large increase in the accuracy of the initial conditions. We are stuck because we can't improve the accuracy of our ancestors' weather stations, and other methods, such as ice core samples, will only ever yield averages. But as our earlier coverage on the nature of climate modeling explains, this isn't really the heart of the issue. Climate models use a range of initial conditions and measure the probability of certain climatic conditions occurring based on those modeling results.

Instead of focusing on the physics of the climate or the dynamical system, Roe and Baker look at the behavior of a simple linear equilibrium system with positive feedback. All the physics is replaced with a single gain parameter, which describes how an increase in average temperature leads to a further increase in temperature. Although this does not describe the physics, it does encompass what we measure, so the model is valid for their purposes. They then explore how uncertainty in the gain parameter changes the predicted temperature increase. The positive feedback has the effect of amplifying the uncertainties (just as a nonlinear system does), meaning that it is practically impossible to improve climate estimates. This uncertainty is not really derived from the initial conditions (i.e., the starting climatic conditions) but rather from the natural uncertainty in the physical mechanisms themselves, which is a major focus of current modeling efforts and includes such things as cloud cover. Basically, the amplification of these uncertainties, combined with the timescales involved, means that the smallest uncertainties blow out into the large range of temperatures predicted by climate researchers.
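
For reference, the feedback argument can be written down in a couple of lines. This is the standard linear-feedback form as commonly presented; the symbols here are conventional ones and may not match the paper's exact notation.

```latex
% Linear-feedback form commonly used in this context:
% \Delta T  : equilibrium temperature response
% \lambda_0 : reference (no-feedback) sensitivity
% \Delta R_f: radiative forcing
% f         : total feedback factor (0 < f < 1 for net positive feedback)
\[
  \Delta T = \frac{\lambda_0 \, \Delta R_f}{1 - f},
  \qquad
  \frac{\partial (\Delta T)}{\partial f} = \frac{\lambda_0 \, \Delta R_f}{(1 - f)^2}
\]
% Because of the 1/(1-f) dependence, a modest, roughly symmetric uncertainty in f
% becomes a large, skewed uncertainty in \Delta T as f approaches 1 -- which is
% why better-constrained physics only modestly narrows the projected range.
```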

This news will not call off the search for parts of the environment that influence our climate because, if we are to mitigate global warming, then we must know which bits of the environment are the best to change. This obviously includes human behavior, but that covers a whole gamut from urban lifestyles through to farming practices. Part of this picture is soil erosion, which removes carbon from the soil and deposits it elsewhere. The question isn't so much where, but what happens to that carbon en route and once it arrives. It was thought that perhaps soil erosion contributed carbon dioxide to the atmosphere by opening up new mechanisms for the decomposition of organic matter. Alternatively, it has been argued that soil erosion deposits organic carbon in places—like the bottom of the sea, for instance—where it is effectively stored. However, testing these hypotheses has been problematic.

Nevertheless, problematic is what a good scientist looks for, so, with fortitude and dedication to the cause, scientists from the EU and US have collaborated to measure the uptake and removal of carbon over ten sites. They report in Science this week that, like normal land, eroding land also acts as a carbon sink. They do note that in eroding landscapes the carbon is more likely to move laterally, but it is no more likely to enter the atmosphere as carbon dioxide than it is on healthy pastureland. Of course, the amount of carbon stored is slightly less, so these soils are perhaps not as efficient as carbon sinks as normal soils are. Some research is still needed to determine whether there are differences in the long-term destination of carbon between normal pasture and eroding soils—however, until that research is done, we can cross soil erosion off the list of things to worry about in terms of global warming.

As for the bad news: rapid industrialization in the developing world and a lack of action in the developed world are now measurably increasing the rate at which we deposit carbon dioxide into the atmosphere. This is the conclusion of a paper to be published in the Proceedings of the National Academy of Sciences. Essentially, the authors took estimates of anthropogenic carbon dioxide emissions, compared them to the measured concentration in the atmosphere, and determined from the time series that the natural carbon sinks are either already saturated or nearing saturation. The conclusion is that the concentration of carbon dioxide in the atmosphere is likely to increase faster than predicted in most scenarios. This is especially true since most scenarios assume that we will take some action to keep the rate of increase in atmospheric carbon dioxide (as a percentage) below the rate of economic growth (also as a percentage). Not the best news.

Electronic Arts to undergo empire-wide restructuring, layoffs

When you're on top, the only place to go is down. In the face of stiff competition, EA's profits have begun to drop. Destructoid is reporting that job cuts and branch restructuring have already begun, with extensive changes being made to many different studios under EA's umbrella, including Mythic.

Word of these changes came from an internal EA e-mail. CEO John Riccitiello has begun taking steps to ensure that the company's current state of affairs doesn't continue. This follows a previous restructuring meant to rebalance staff across the many branches of the company. To quote the e-mail:

Given this, John Riccitiello, our CEO, has tasked the company to get its costs in line with revenues… Every studio, group and division of the company has been tasked to review its overall headcount and adjust its organization to meet the needs of the business moving forward.

The changes to Mythic appear to be only the first in what will be a long line of changes. Certain teams, such as the Ultima Online group, will be relocated. Competitive employment strategies will also be enforced to keep employees working hard if they want to keep their jobs: "attrition, performance management, stricter hiring guidelines, and layoffs" will purportedly keep workers in check.

Given the state of EA's multiplatform competitors, including Activision, which is set to release one of the assured hits of the winter in Call of Duty 4, and long-time rival Ubisoft, which is sitting on Assassin's Creed, the company will be pressed to start taking more risks like skate if it hopes to stay fresh in this increasingly competitive development scene.

Gmail delivers a knockout punch: IMAP changes the “freemail” game

Not everyone will appreciate it, but Google just upped the ante in the webmail game by rolling out unprecedented free IMAP access for Gmail. Beginning last night, Google began activating IMAP access on Gmail accounts, and as of this morning plenty of (but not all) users are reporting that IMAP access to Gmail is now possible. IMAP isn't a new technology, and it's certainly not universally loved. It does make Gmail that much more accessible on a variety of devices and from multiple locations, however, and it puts Microsoft and Yahoo in the position of needing to play catch-up.

Gmail has allowed access via the web interface or POP for quite some time now. POP allows e-mail clients to download messages from the server but doesn't reflect any changes back on the server once the messages are manipulated on the client side. So if you download five messages, read four of them, and move three of them to other folders in your desktop e-mail client, those messages will remain unread and unmoved on the Gmail server. When you check the server again from a different device, you have to go through the whole process all over again with the same messages.

Such is not the case with IMAP—any changes you make on the client side are synced back with the server (when a connection is available), so that read items remain read and moved items remain moved on all devices checking that account. In other words, IMAP treats remote folders as if they were local, which is great if you use more than one interface for accessing and organizing your email (say, webmail from work, your iPhone on the road, and a mail client like Thunderbird at home).
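
A minimal sketch of that sync behavior using Python's standard imaplib, with placeholder credentials; imap.gmail.com on port 993 (SSL) is the endpoint Google documents for Gmail IMAP access.

```python
# A minimal sketch of the sync behavior described above, using Python's standard
# imaplib. The address and password are placeholders; imap.gmail.com on port 993
# (SSL) is the endpoint Google documents for Gmail IMAP access.
import imaplib

conn = imaplib.IMAP4_SSL("imap.gmail.com", 993)
conn.login("you@gmail.com", "your-password")   # placeholder credentials
conn.select("INBOX")

# Find unread messages and mark the first one as read. Because this is IMAP, the
# \Seen flag is stored on the server, so every other client sees the message as
# read too -- unlike POP, where such changes never leave the local client.
status, data = conn.search(None, "UNSEEN")
msg_ids = data[0].split()
if msg_ids:
    conn.store(msg_ids[0].decode(), "+FLAGS", "\\Seen")

conn.logout()
```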

Google has a help page up that explains the differences between POP and IMAP, along with instructions for setting up the latter on your account. IMAP isn't push mail, and it isn't known for being lightning fast. It is, however, the best widely-supported protocol for accessing mail from multiple devices.

IMAP not for the weak (servers)

IMAP has been slow to come to free e-mail services for a variety of reasons. In addition to the fact that most users of free e-mail services have been happy with webmail clients and therefore don't care about IMAP, the protocol generally requires more resources per active user on the server side. Because IMAP typically maintains connections between client and server, it requires more bandwidth and processing power per user. By enabling IMAP, Google is flexing its muscles.

IMAP also encourages users to store messages on the server over the long term—something that POP users do as well, but perhaps not as often or in such high volume, and certainly not in remote folders. IMAP access is thus enabling another way to get at the massive storage capacity offered by Gmail.

Gmail Product Manager Keith Coleman has another theory on why webmail services haven't made IMAP widely available, noting that most (including Google) are at least somewhat dependent upon advertising revenue from their web-based clients. "We thought that was a trend worth breaking," he told Ars. "Initial reaction has been great so far."

While this tidbit of news may not mean anything to casual Gmail users, it's a major feature addition for an otherwise free, publicly-available e-mail service. Not only does it come as a welcome addition to those who make heavy use of Gmail, it will also make a difference to businesses and institutions that use Gmail as part of Google Apps for your Domain. Lots of business users check their e-mail on various handheld devices in addition to a computer or two, so the addition of IMAP will make Google Apps for your Domain a much more attractive service for companies looking for comprehensive, yet easy, e-mail solutions.

While Yahoo and Hotmail have more or less caught up with Google when it comes to offering mass amounts of storage capacity, the addition of IMAP to Gmail will make a big difference in free e-mail services in the future—it likely won't be long before we see IMAP capabilities added to Yahoo and Hotmail, and perhaps even a smattering of smaller e-mail services vying for attention.

As mentioned at the outset, IMAP isn't universally loved. It can be slow, especially over wireless mobile data networks that barely escape dial-up speeds. Yet more access options can't hurt, and judging from reader reports we've received, many of you with mobile devices that support IMAP are already itching for Google to activate your account. If you try it out, let us know your experience in the discussion. We're currently finding the iPhone support to be excellent (over WiFi).

Let open access reign: Verizon relents on legal challenge to FCC

In a surprise move, Verizon has dropped its lawsuit challenging the open access requirement for next January's 700MHz spectrum auction. Earlier this week, the telecom filed a motion to voluntarily dismiss the case filed last month with the Court of Appeals for the DC Circuit. The motion for dismissal cites the court's decision to deny Verizon's request for an expedited review as the reason for dropping the case, as an unexpedited review is unlikely to be completed prior to the start of the auction.

The rules laid down by the Federal Communications Commission require the high bidder for the "C" block—a highly desirable 22MHz chunk of spectrum—to allow any lawful device and any lawful application to use the spectrum. As a result, Verizon's current cellular model, which imposes restrictions on the devices and types of content that can be used on its network, won't fly.

When it sued the FCC to have the open access rules overturned, Verizon called the rules "arbitrary" and "capricious," saying that the mandate was "unsupported by substantial evidence and otherwise contrary to law." Verizon's challenge drew criticism not only from advocates of open access, but from some other companies that are expected to bid on the spectrum. Google was especially critical of Verizon, saying that the company didn't believe "consumers deserved more choice" than they currently have.

Other companies have voiced objections to the FCC's auction rules as well. AT&T has asked for clarification on the requirements for Block "D," a chunk of spectrum that would be used for both commercial wireless broadband and public safety access. Frontline, which wanted to run a public/private wireless network using the spectrum, has argued that the FCC's $1.6 billion reserve price for Block D is too high and that the auction rules would make it too easy for a big incumbent like AT&T or Verizon to snap up enough spectrum to result in "unacceptable anticompetitive effects."

Frontline has also attempted to have Verizon barred from participating in the auction for violating the FCC's lobbying rules, citing a September 17 meeting between Verizon, FCC Chairman Kevin Martin and some other FCC staffers. Under the FCC's rules, companies are supposed to submit ex parte filings disclosing the nature of these meetings; Frontline called Verizon's single-sentence filing an "arrogant violation" of the rules.

Needless to say, the FCC hasn't barred anyone from participating in the auction, and, earlier this month, the FCC released the final set of rules for the spectrum sell-off. Aside from bumping the auction back eight days to January 24, 2008, there were no changes of note. All that's left is to let the bidding begin and see which of the would-be bidders (including AT&T, Google, Verizon, and Frontline) step up to the auction block.

XBLA Wednesday: Exit the soul of Battlestar Galactica

The streak of double-header Xbox Live Arcade Wednesdays continues this week with the release of two new titles, Exit and Battlestar Galactica. Both are now available and can be had for 800 Microsoft Points ($10) each.

Exit is one of the lesser-known, though quite entertaining, PSP titles, and the Xbox Live Arcade port lives up to the off-beat puzzle action of the original. As Mr. ESC, you'll need to move around burning buildings and traps and save helpless citizens by leading them out of their terrible circumstances while overcoming various puzzles, much like in Abe's Oddysee. While the original PSP title was released with only 100 levels, the XBLA version touts 220 levels, online leaderboards, and future downloadable content. That's quite an upgrade for $10.

Unfortunately, while the new content adds value and the HD graphics look great, some nagging issues from the PSP version still haunt this release. Originally criticized for spotty context-sensitive jumping controls, the XBLA version of Exit still suffers from some weird control nuances. These don't wreck the experience, but they are a blemish on an otherwise strong title.

Given how well Wing Commander Arena went over, I can't imagine this week's other release, Battlestar Galactica, will do very well. Though the TV series is popular for its drama and political intrigue (more so than its combat), the game forgoes story for the sake of top-down space shooting action. Those hoping for some well-integrated use of the Battlestar lore are in for a disappointment. While there may be some familiar sights in the form of backdrops and vehicles, the game does little to maintain the frantic, humans-against-the-universe feel that the series is so well known for. If you can look past that, there's a serviceable, albeit somewhat slow-paced, top-down, free-roaming space shooter to be found—not exactly what fans will want as they await the fourth and final season.

Exit is definitely the choice for the week. If you haven't played the title yet, do yourself a favor and pick it up. Even with the control problems, it's still a lot of fun to play.

Game Review: Rockstar Games Table Tennis (Wii)

This is another port for the Wii that we've basically reviewed before in its Xbox 360 incarnation, so now that Table Tennis is out for the Wii, the only thing that needs to be talked about is the control scheme. Everything else, sans online play, is the same as the Xbox 360 version. Sure, the graphics are downgraded, but almost everything you need to know about the game can be found in our big review of the original title.

A table tennis game seems like a natural fit for the Wii, but other developers have gotten Wii controls wrong before. Luckily, Table Tennis hit this one over the net. There are nice tutorials that allow you to practice each technique, and you'll need them; it takes some practice to learn how you're supposed to move the Wiimote to aim the ball correctly. The game does not feature 1:1 movement; you basically swing in different directions to aim the ball. If you want the ball to go to the upper left-hand corner, you swing up and to the left. Close to the net on the right-hand side? Swing down and to the right. I had problems when I tried to control the game with large, sweeping movements, but once I started to hold the Wiimote like a table tennis paddle while making tighter movements, it all became clear. Give yourself a few minutes to get used to the controls, and you'll see that while they're not instantly intuitive, they do work very well.

Holding the Wiimote in front of you to make these short swings sounds easy, but with volleys getting longer and longer as you play, don't be surprised if you break a sweat. With the Wiimote-only control scheme, you add spin with the d-pad, and the computer moves your player for you. If you add a nunchuk, you can move your character yourself, but that gets hard to keep track of very quickly. In a third control setup, you can use the nunchuk to add fine control over where your ball lands, but that's only for people with insanely talented hands. These are the three control methods, and while I prefer the default Wiimote-only controls, the other two with the nunchuk can be fun, if only to test yourself.

There is no online play (boo!), but playing with another person in the room is a great time as you knock the ball back and forth. It feels oddly like an actual game of table tennis, and things can get intense very quickly.

At $40, this is a little less expensive than your average game, so if you're looking for something a little more in-depth than Wii Tennis, then this is a great buy. And it isn't a bad workout, either. It's good to turn on the Wii again.

Status: Buy
Price: $39.99
System: Nintendo Wii
Developer: Rockstar Games
Publisher: Take 2
ESRB Rating: Everyone
Other recent reviews:

Mercury Meltdown Revolution, Folklore, Final Fantasy Tactics: The War of the Lions, MySims, Eternal Sonata

Cornucopia of Leopard t-shirts available for launch

With Leopard dropping this week, there is no shortage of new Apple-themed apparel going on sale or being given away for free. Apple will, of course, be giving away its own shirts to the first 500 visitors in each of its stores Friday afternoon, but the thread doesn't stop there.

FastMac, purveyor of all manner of Mac upgrades, has announced that it will be giving away four new Leopard-themed t-shirts outside various Apple Stores on Friday from 4 to 6pm. Three of the shirt designs are featured and available for order at a deep discount of just $7.99 on FastMac's site. The designs include geeky cultural puns like "Hasta la Vista," "Mac to the Future," "A whole new Xperience," and "300 – Madness? This is LEOPARD!," though that last one is oddly not pictured or available for order yet.

Next up is MacMerc with a couple of shirts, the first of which is called "Top Secret." This is a simple design that homes in on Leopard's original shroud of secrecy. MacMerc's second Leopard-themed t-shirt, called "Mac OS X Leopard: Time Machine," also riffs on the Back to the Future theme.

Last, but probably not least, is a set of Apple, Mac, and general tech-themed shirts that you need to vote into existence. Insanely Great Tees is back with another five shirts that it wants hopeful owners to weigh in on. The shirts that get the most clicks will go to print, and choices range from an illustrated "Any" key to a graphical timeline of Apple's first 30 product years. There are even anthropomorphized i-gadgets playing music.

With Apple's popularity showing no signs of slowing down lately, these probably won't be the only ways for Apple fans to don new threads to express their inner geeks. Who knows, maybe even Think Geek will unveil an Apple-centric shirt or two in the coming months.

Teachers’ lack of fair use education hinders learning, sets bad example

Here's how bad it is: not a single teacher interviewed for a recent study on copyright reported receiving any training on fair use.

Copyright confusion is running rampant in American schools, and not just among the students. The teachers don't know what the hell is going on, either, and media literacy is now being "compromised by unnecessary copyright restrictions and lack of understanding about copyright law."

That's the conclusion of a new report from the Center for Social Media at American University. Researchers wanted to know if confusion over using copyrighted material in the classroom was affecting teachers' attempts to train students to be critical of media. The answer was an unequivocal "yes."

One teacher, for example, has his students create mashups that mix pop music and news clips to comment on the world around them. Unfortunately for the students, the school "doesn't show them on the school's closed-circuit TV system" because "it might be a copyright violation."

One big problem is that few teachers understand copyright law; they follow guidelines drawn up by school media departments or district lawyers, or they rely on books that attempt to lay down principles appropriate for an educational setting. As the report notes, though, this advice is generally of the most conservative kind, while long-established principles of fair use may afford far more rights—especially in a face-to-face educational setting.

Researchers found that teachers may not understand the law (or may understand it to be unduly restrictive), but that they deal with their confusion in three different ways. Teachers can "see no evil" by refusing to even educate themselves about copyright, on the thinking that it can't be wrong if they don't know it's wrong. Others simply "close the door" and do whatever they want within the classroom, while a third group attempts to "hyper-comply" with the law (or what they perceive the law to be).

The result can be less-effective teaching tools. One teacher profiled in the survey wanted to promote literacy among kids who might not be enthused about it, and he thought that using lyrics from the Beatles and Kanye West might be a good way to do it. The license holders wanted $3,000. The report's authors claim that a robust understanding of fair use would give educators far more confidence about using such materials in the classroom.

Because teachers aren't confident in the rules and have no training in fair use, many rely on rules of thumb with no real basis in the law. One teacher, for instance, told her students, "If you have to pay to use or see it, you shouldn't use it," even though uses of such works for commentary, criticism, and parody are explicitly established by US copyright law. The result is students who are even less informed about copyright law.

Creating a new "code of practice" for educators could go some way toward fixing the situation, especially if such a code were blessed by major library and teachers' associations.

But the basic issue is the fear of lawsuits that could cost a school district tens of thousands of dollars. Because the four fair use factors are intentionally left vague (so that they can cover a huge variety of situations), those in charge of local copyright guidelines tend to issue rules far more stringent than the law obviously requires. This new report hopes to show educators that by learning a bit more about copyright, they can have confidence in crafting a broad array of teaching tools and classroom assignments, even when those involve bits of copyrighted work.

Examining the security improvements in Leopard

There have been several articles on Leopard's new security features popping up on various Mac websites, but so far they've all been little more than rewrites of the security section in Apple's list of 300 new Leopard features. However, Rich Mogull's "How Leopard Will Improve Your Security" on TidBITS goes much further.

Interestingly, Rich starts by touting Time Machine as a big security win. A good way to keep your data from prying eyes is to delete it—don't forget to "erase free space" with the appropriate security options in Disk Utility, though—but that also kind of defeats the purpose of having data in the first place. Time Machine makes sure you get to keep your data to secure it another day.

The next improvement that Rich points out in Leopard is "stopping buffer overflows." Well, that's not actually what Leopard does. Even in Leopard, writers of applications, libraries, and operating system components can still write code that fails to restrict input data, allowing it to be written beyond the memory buffer set aside for it. Therefore, buffer overflows are still possible. But the whole point of a buffer overflow exploit is to get the system to execute code sitting in that excess data—"arbitrary code" that can do something on behalf of the attacker. What Leopard does is randomize the location of various libraries in memory. This means that an attacker can't simply make the program jump to a known library location as part of the next step in its attack. Library randomization isn't foolproof—an attacker can still get lucky or be very persistent—but it certainly derails the vast majority of buffer overflow attacks.
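
You can get a feel for what library randomization buys you with a few lines of Python on any Unix-like system. This is a generic illustration, not anything Leopard-specific: run it more than once and, with randomization enabled, the reported address changes between runs.

```python
# Generic illustration of library load-address randomization (not Leopard-specific):
# print the in-memory address of libc's printf on a Unix-like system. With
# randomization enabled, the address changes from run to run, which is exactly what
# denies a buffer-overflow exploit a known location to jump to.
import ctypes

libc = ctypes.CDLL(None)  # handle to the C library already loaded into this process
addr = ctypes.cast(libc.printf, ctypes.c_void_p).value
print(f"printf is loaded at {addr:#x} in this run")
```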

The article goes on to talk about "identifying and defanging evil apps" in the form of tagging downloads, explains how vulnerable system components run in a "sandbox," and more. Personally, I'm very interested to see what the firewalling improvements amount to. Applications can be firewalled individually in Leopard, but it's unclear at this time how fine-grained that control is.

Using antennas to see really small stuff

A lot of the recent developments in microscopy have centered on visible light (400-650nm) or near-infrared light (700-2500nm). This is because detectors are most sensitive to visible and near-infrared light, and most commercial lasers operate in this wavelength range. The problem is that nothing interesting happens in this wavelength range. Most objects are reasonably transparent to light over the visible and near-infrared ranges, so images are generally created by labeling a region of interest with fluorescent materials, which then glow in the presence of the laser light. Another problem with microscopy is the diffraction limit, which tells us what the smallest resolvable feature is. In most cases, this is something like the wavelength of the light (around 400nm), and that is too big to resolve individual proteins or DNA molecules. Microscopy using mid-infrared light and optical antennas to beat the diffraction limit may enable high-resolution imaging that can also identify the chemical being imaged. Here, we report on some recent progress in developing the tightly focused light source required for such a microscope.
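
For reference, the diffraction limit being discussed is usually quoted in something like the Rayleigh form; the notation below is the standard textbook one, not taken from the paper.

```latex
% Rayleigh-style statement of the diffraction limit: the smallest resolvable
% separation d for light of wavelength \lambda collected with numerical
% aperture NA (standard textbook form, not taken from the paper).
\[
  d \approx \frac{0.61\,\lambda}{\mathrm{NA}}
\]
% With visible light (400-650 nm) and NA of order 1, d is a few hundred nanometers,
% far larger than a single protein; at a mid-infrared wavelength of 5 micrometers,
% d balloons to roughly 3 micrometers, matching the figure quoted later.
```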

As we have discussed in other articles, there are methods for defeating the diffraction limit. For example, light can be guided in some structure that is tapered to a tip whose dimension is much smaller than the wavelength of light (say 10nm). If the outside of the tip is conductive, the light excites the electrons, causing them to collectively vibrate down the guiding structure. At the end of the structure, the electrons release the energy as light, as if it had been conducted down the structure. However, the light is emitted in every direction and is only very intense right at the end of the tip. The intensity of the scattered light can be used to map a surface with a resolution about the same as the tip diameter.

Using this and similar techniques, scientists could, in principle, resolve an individual protein molecule. The difficulty is that the protein is transparent to the light used, and if we use a fluorescent label, we are imaging the label, not the protein. In other words, labels are very useful when looking at populations of proteins (or other molecules) but are of more limited use when studying individual molecules.

Enter the quantum cascade laser, a unique class of laser that emits in the mid-infrared (3-5 micrometers). These lasers use a very finely structured semiconductor to weakly confine electrons in very small boxes. The boxes give the electrons a set of well-defined energy levels to occupy. When a voltage is applied, the electrons travel from box to box in such a way that they must transition down an energy level with each move. With every transition, they release a photon of light, and the presence of photons can stimulate electrons to make the transition; hence, a laser is born. The difference is that the wavelength of these lasers is limited only by the physical dimensions of the boxes, meaning that we are no longer stuck with the laser light colors given to us by nature. Quantum cascade lasers have found their niche in the mid-infrared and infrared (3-15 micrometers), where they make a lovely, reliable source for people wanting to do spectroscopy.
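
The link between the engineered level spacing and the emission color is just the photon-energy relation; as a rough worked example (not taken from the paper):

```latex
% Photon-energy relation linking the engineered intersubband spacing \Delta E to
% the emission wavelength \lambda (rough worked example, not from the paper):
\[
  \lambda = \frac{hc}{\Delta E}, \qquad hc \approx 1240\ \mathrm{eV \cdot nm}
\]
% A 4-micrometer (4000 nm) quantum cascade laser therefore corresponds to a level
% spacing of roughly 1240/4000 \approx 0.31 eV, set by the box dimensions rather
% than by a material's fixed band gap.
```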

The thing that makes this interesting is that almost every molecule in existence absorbs somewhere in the mid-infrared, making mid-infrared spectroscopy a key tool for identifying and understanding molecules. The problem is that the diffraction limit means that you can only resolve objects around three micrometers big. In principle, the quantum cascade laser could be used to detect the absorption from a single protein molecule, but it can only tell you where that molecule is to within three micrometers.

Now a group of researchers from Harvard, with support from Agilent, have combined the ideas used for high resolution imaging with quantum cascade lasers. To do this, they deposited a couple of metallic strips on the emitting face of the quantum cascade laser, forming an antenna. This metallic layer absorbed a lot of the light from the laser, causing the electrons to oscillate coherently. The light emitted from the gap between the strips is very intense because it gets most of the energy from the antenna. However, it also radiates in every direction, so the intensity is only very high near the gap. Imaging with such a laser will reveal features on the order of the gap size, which is about 100nm. This is still too big to reveal single proteins, but is certainly much smaller than most microscopes operating in the mid-infrared.

Now, there is a downside to this. Unlike normal laser diodes, quantum cascade lasers aren't really that tunable. If you ask for a quantum cascade laser with a wavelength of five micrometers, that is what you will get. Unfortunately, spectroscopy really requires access to a broad range of colors, all in the mid-infrared. This means that the light source will have to be different if this is to be employed as a generalized microscopy tool. However, there are plenty of applications where the ability to image the locations of a few key chemicals is all that's needed to obtain useful information. There is certainly room for a specialized instrument utilizing this technique.

Applied Physics Letters, 2007, DOI: 10.1063/1.2801551

ICANN probing “insider trading” allegations with domain name registrations

The Internet Corporation for Assigned Names and Numbers (ICANN) has begun an investigation (PDF) into accusations that some insiders may be using inside information to collect data and purchase unregistered domain names that get a lot of DNS lookup requests—nonexistent domains that surfers frequently try to access. ICANN refers to the practice as "domain name front running," adding that it—along with several registrars and intellectual property attorneys—has received a number of complaints from registrants suggesting that such a thing has occurred. While the organization has no solid evidence on the matter as of yet, it feels that an investigation is warranted in order to nip in the bud any perceptions that the domain name industry is involved in unethical activity.

ICANN's Security and Stability Advisory Committee (SSAC) likens the practice to stock and commodity front running, which occurs when a broker makes a personal stock purchase based on inside information before fulfilling a client's order. An insider at one of the popular domain registrars can see which domain names are popular with visitors, even if they are not yet registered. That person can then register a domain before the general public does, knowing how much traffic it is likely to get, with the intent of reselling it at a profit later.

While the practice is illegal when it comes to stocks and commodities, the situation is much murkier when it comes to domain names. ICANN recognizes the lack of regulation covering this area and makes it clear that a stronger set of standards needs to be established. "ICANN's Registrar Accreditation Agreement and Registry Agreements do not expressly prohibit registrars and registries from monitoring and collecting WHOIS query of domain name availability query data and either selling this information or using it directly," writes the SSAC. "In the absence of an explicit prohibition, registrars might conclude that monitoring availability checks is appropriate behavior."
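
To make the quoted concern concrete, here is a minimal sketch of the kind of availability check the SSAC is talking about. The WHOIS protocol is a single line of text over TCP port 43, and whois.verisign-grs.com is the .com registry's WHOIS server; the domain name below is just a placeholder. Any party positioned to log queries like this one could, in principle, mine them for front running.

```python
# Minimal sketch of a WHOIS availability check for a .com name -- the kind of query
# the SSAC worries could be logged and mined. The WHOIS protocol is one line of text
# over TCP port 43; whois.verisign-grs.com is the .com registry's WHOIS server, and
# the domain below is just a placeholder.
import socket

def whois_com(domain: str) -> str:
    with socket.create_connection(("whois.verisign-grs.com", 43), timeout=10) as s:
        s.sendall((domain + "\r\n").encode("ascii"))
        chunks = []
        while True:
            data = s.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks).decode("utf-8", errors="replace")

response = whois_com("some-name-to-check.com")  # placeholder domain
print("available" if "No match for" in response else "registered or reserved")
```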

The SSAC report comes just a day after news leaked that Verisign, a major root name server operator, was considering selling access to select DNS server lookup data. DomainNameNews first broke the story, saying that sources had indicated the company would provide "lookup traffic" reports on specific domains. The sources also said that pricing for the service was not known, but that it could cost up to $1 million per request.

The SSAC is now calling for public discussion of the situation in hopes of gathering more data and coming up with standardized practices on how to manage it. The committee suggests that those involved with domain name registrations "examine the existing rules to determine if the practice of domain name front running is consistent with the core values of the community." If registrants continue to find what they consider to be evidence of the practice, SSAC requests that they submit incidents to [email protected] with as much information as possible, including specific details of domain name checks and copies of any correspondence with the party believed to be engaged in domain name front running.

Microsoft antes up $240 million for a piece of the Facebook action

All of the recent flirting between Facebook and Microsoft has turned into hot equity action, as the two companies have announced that Microsoft will make a $240 million investment in the social networking site. In addition, Microsoft will begin selling ads for Facebook outside of the US and will become the site's exclusive ad provider in the US.

Facebook's value is not in the software itself—which could be duplicated relatively easily by a small group of programmers—but in the vast social networks the site has gathered, networks that contain information about people's interests and desires that would be invaluable for any marketing company.

Launched in early 2004, Facebook was originally targeted to college students, limiting registrations to those with a .edu e-mail address. The company opened the registration doors to all comers in September 2006, and the move appears to have paid off: the site is drawing an average of 250,000 new registered users every day, according to Facebook. Facebook now has over 49 million active users, according to VP of operations Owen Van Natta.

Just a couple of weeks before removing the college-students-only registration limitation, Facebook and Microsoft inked an advertising pact that made Microsoft the exclusive banner ad provider. The companies extended that agreement through 2011 earlier this year.

Google had also been rumored to be courting Facebook, but Microsoft appeared determined to close the deal. Google already has an exclusive $900 million pact with MySpace to provide that site—and other Fox Interactive Media properties—with contextual ads and search services. Yahoo has also courted Facebook in the past, but the $750 million to $1 billion offers were apparently not enough to scratch Facebook's financial itch.

Microsoft's $240 million investment is part of a new round of financing for Facebook, one that places a $15 billion valuation on the company.