Wednesday, February 28, 2007

The Great Cache Hunt


Knock out the cache files that are slowing down your PC.
By Joe DeRouen
As regular readers probably know, I'm a big supporter of Firefox. Mozilla's browser has proven to be leaner, quicker, and safer for Windows users than any of its competitors, Microsoft's very own Internet Explorer included.
Have you ever gone to a Web site looking for an article, story, or piece of information that you remember reading, only to find that the relevant piece--or, worse yet, the entire site--has gone missing? That can be a frustrating experience, especially if you can't find the information anywhere else.
Enter Resurrect, a Firefox extension that just may save the day for you. The browser add-on allows you to search the Internet Archive (otherwise known as the Mr. Peabody and Sherman-inspired Wayback Machine), MSN Cache, Yahoo Cache, CoralCDN, and the Google Cache for an archived version of the missing page or Web site. If the first service doesn't have the information you seek, chances are one of the other four will. You can activate Resurrect in a number of ways, including right-clicking on a missing-page error message or selecting it through the Tools menu. Once you launch the plug-in, you can open the missing page in the same window you're currently browsing, in a new tab, or in an entirely new window. Sure, you could go to any of these services without this extension and manually enter the URL of the missing Web page yourself, but this extension makes finding that missing information just a little bit easier.
Resurrect is a free download.
Will Swap Cache for Windows
Unlike the Internet Archive, you don't want your PC to save everything in its cache. In fact, you should periodically clear your Windows cache to prevent system slowdown. And that's where MemTurbo 4 comes in. This program not only helps you manage your PC's memory but also trains your computer to make better use of available RAM.
Cache, in this case, is an area of physical memory reserved by Windows to store the most recent operations and file requests. When a file is read from the disk, it's stored in the system cache. If there is another read request for the same file, Windows can retrieve it from the faster system cache memory instead of reading it again from the slower hard disk, which, in theory, will save you time.
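To make the concept concrete, here is a toy Python sketch of that read-caching idea (purely illustrative; the real Windows cache lives in the kernel and manages memory pages, not whole files):

    class ToyFileCache:
        """Serve repeated reads of the same file from memory instead of disk."""
        def __init__(self):
            self._cache = {}                     # path -> file contents held in RAM

        def read(self, path):
            if path in self._cache:
                return self._cache[path]         # fast path: already in memory
            with open(path, "rb") as f:          # slow path: go to the disk
                data = f.read()
            self._cache[path] = data             # remember it for next time
            return data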
File caching, however, can behave erratically after a period of time, eating into your available RAM. The memory available to run applications shrinks as the cache bloats. In this situation, the system's performance may degrade, forcing you to restart your computer.
MemTurbo helps prevent that by sealing RAM leaks from poorly written programs, clearing unneeded applications out of memory, and installing a memory monitor that keeps everything running smoothly and efficiently. Once the program is installed, running applications can use their allocated memory without forcing Windows to swap from the cache.

The program is easy to install and, once it's loaded, you probably won't even notice it's there. An improved task manager helps you to better monitor processes in memory, while also supporting better graphic displays to improve performance of 3D games and graphics-intensive applications. And if you're a power user who wants to have more control under your PC's theoretical hood, you can do that too; MemTurbo allows you to tweak what parts of memory are allocated where, leaving you to decide your own destiny in terms of how your PC runs.
A 15-day trial version is available, and the application costs $19 to register.

Tuesday, February 27, 2007

Digital Media Distribution Opportunities for the Film Industry


Technology advancements such as those in Windows Media 9 Series are enabling new distribution opportunities for the film industry, including online, on CDs and DVDs, and in theaters.

Growing Options for Viewing Films:
The PC as an entertainment hub is fast becoming a reality, with increased processing power combined with a fast broadband connection, connectivity to a variety of displays, and improvements in the compression and decompression of high-end audio and video. These new capabilities present film distributors with both an opportunity and a challenge: how to target this new digital entertainment gateway with digital movies and video without losing control of the content in the process. Already, some estimates say as many as 500,000 digital movies are being exchanged illegally over the Web.

How can technology bridge the gap between what consumers want (to find, acquire, play back, and share movies online) and what the film industry wants (secure content, business models that work, and a great consumer experience)? Advancements in digital media technology are opening up new distribution opportunities for the film industry. To take advantage of these opportunities, the film industry needs the ability to secure valuable assets, deliver them to customers, and ensure a high-quality playback experience on par with other playback options, such as watching a DVD in a home theater or a pay-per-view movie on cable. Technology such as Windows Media 9 Series is being developed to meet those requirements and open up new distribution options. This whitepaper discusses key features in Windows Media and how they enable three distribution channels for the film industry: the Internet, CDs and DVDs, and digital media-enabled theaters.

Internet Distribution:
Advancements in Internet digital media distribution have happened quickly. The first generation of streaming came online around 1994, with the first upsurge in Internet usage. The experience was audio only, and low-quality audio at that. But technology pioneers saw the potential, and teams of developers worked to pack higher quality into the small file sizes needed to transport the data in a stream, in real time, to the user.

The second generation of streaming is the one we're familiar with now: good audio quality in a reasonable file size, and acceptable video quality when played back in a small window. The second generation of digital media streaming also introduced digital rights management--the ability to secure content and associate it with licenses that authorize playback.

The third generation of digital media on the Internet is where Microsoft is now focusing development efforts. This new technology will meet the requirements of the film industry in the following areas:

Security – The third generation will include more robust digital rights management solutions to secure the delivery of digital media.

Quality – Consumers need a high-quality experience, similar to what they're used to when watching movies at home on TV, both in video quality and in delivery quality.

Improved economics – With technology providers like Microsoft focusing on creating digital rights management technology to secure the content and building the technology to deliver a high quality consumer experience, the film industry can focus their efforts on creating business models for distributing content online. Windows Media 9 Series was built around these requirements and includes some new features that directly impact these areas.

No More Buffering Delays:
A new feature in Windows Media 9 Series called Fast Streaming delivers an "instant-on" streaming experience for broadband users, effectively eliminating the buffering delays that consumers experience with streaming video today and offering a more TV-like viewing experience, with the ability to quickly channel-surf around video content on the Web. It also eliminates the buffering users get when an ad is inserted into a video stream. Fast Streaming also automatically optimizes the delivery of streaming audio and video to take advantage of the full bandwidth available to the user, which vastly reduces or eliminates the impact of Web congestion for broadband users.

High Quality Audio and Video:
Codec improvement is an ongoing process. The new Windows Media 9 Series audio and video codecs improve quality approximately 20% without increasing the file size. This means online film providers can either increase their current quality levels or decrease their current bandwidth costs by switching to the new codecs. Combining Fast Streaming with the new audio and video codecs brings a greatly improved online video experience to consumers and makes online distribution of films via video on demand services even more attractive to consumers and film distributors.

Advanced Encoding Techniques with Windows Media 9 Series


Introduction:
Working with digital media is an art, not a science, so be prepared to practice, test, and tweak to achieve the highest quality. This document provides tips that you can follow to ensure that you start with the best-quality content possible before you begin encoding. It also provides information about techniques that you can apply in the encoding session to ensure that you end up with high-quality encoded content.

Capturing Quality Content:
This section outlines topics to keep in mind as you prepare to capture your audio and video content.
The following points are explained in detail throughout the rest of the section:
Capturing to an AVI File. For the best quality, avoid combining the capturing and encoding processes. Instead, capture to an AVI file first, and then encode.
Comparing Audio and Video Sources. Keep in mind that some audio and video sources are better than others. For the best quality, capture SDI video and digital audio.
Setting Proper Audio and Video Levels. Set your video and audio levels properly before you start capturing.
Optimizing Your Computer. Check that your computer is optimized.
Capturing to Optimal Pixel Formats. Capture to a YUY2 pixel format to avoid color conversions during encoding.
Capturing Optimal Resolutions. Capture video at either a resolution of 320×480 or 640×480.

Capturing to an AVI File:
To ensure the highest-quality results, it is recommended that you capture to an AVI file before encoding. Doing so has the following advantages:
*. Removes any issues related to the processor falling behind the capture process, and enables the encoder to optimize all calculations.
*. Enables the use of editing programs to perform steps such as trimming the start and end times of the file, or doing color correction.
*. Simplifies batch encoding when you source from an AVI file.

Comparing Audio and Video Sources:
It is important to start with the best-quality source. This section lists possible sources, in the order from best to worst:
Serial digital interface (SDI) video. Used for digital video cameras and camcorders. Because the content stays in a digital format throughout the capturing and encoding processes, it requires the fewest data translations and yields the best-quality video.
Component video. Used when sourcing from DVDs. With this source, the video signals are separated, for example, into the RGB or Y/R-y/B-y format. Results in good-quality video.
S-Video. Used for S-VHS, DVD, or Hi-8 camcorders. The video signal is divided into luminance and chrominance. Results in good-quality video.
DV video. Used with DV devices, such as MiniDV digital camcorders connected through an IEEE 1394 video port. Results in good-quality video.
Composite video. Used for analog cameras, camcorders, cable TV, and VCRs. Composite video should be used as a source only as a last resort. With composite video, luminance and chrominance components are mixed, which makes it difficult to get good-quality video.
Audio. If possible, capture digital audio. If you must capture audio from an analog source, balanced audio connections are better than RCA.

Setting Proper Audio and Video Levels:
To set audio and video levels properly:
*. Adjust your video monitor using SMPTE color bars, and then adjust your computer monitor to match, using a high-resolution bitmap of the SMPTE bars.
*. Adjust your video capture card levels (hue, saturation, and brightness), so that the picture matches the video monitor.
*. Check and normalize all audio levels in your system. Use a professional-grade audio card, such as the Echo Layla24 or the M-Audio Delta Series.
*. If possible, use a digital waveform monitor.

Optimizing Your Computer:
Before you start capturing, optimize your computer using the following steps:
1. Defragment your hard disk.
2. Turn off network and file sharing.
3. Close all other programs, especially if a program accesses the hard disk.
4. Monitor system resources, making sure that the computer is sufficiently powerful to keep pace with the data feed.
5. During the capture, watch for frame dropping. It should be possible to capture an entire movie with no dropped frames.
6. Watch for direct memory access (DMA) buffer conflicts between the capture card and the SCSI card, which can result in frame dropping. This is less likely to occur now than in the past. If conflicts occur, one solution is to use a dual PCI bus motherboard configuration, in which the capture card and the SCSI card are on different buses.

Capturing to Optimal Pixel Formats:
It is recommended that you capture to a YUY2 (4:2:2) pixel format, which enables you to avoid pixel format conversions during encoding. The Windows Media Video 9 Series codec primarily uses a 4:2:0 pixel format, except that if you choose to maintain the interlacing in your content (a new feature with Windows Media Encoder 9 Series), a 4:1:1 pixel format is used. Because the YUY2 format is a superset of both the 4:2:0 and 4:1:1 pixel formats, the content can be converted to either format without any data loss.

Note that if you capture to a 4:2:0 AVI file (for example, I420, YV12, or IYUV), you will not be able to maintain the interlacing in your source video.

Older capture devices may create AVI files that do not fully conform to published specifications, resulting in upside-down video with the YUY2 pixel format. To prevent this, you can either set the driver on your capture device to use a different pixel format, or "flip" the image if your driver provides such a feature. Finally, there is also an option to flip the video in the encoder.
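As a rough illustration of the chroma-subsampling relationship described above, here is a hedged Python/NumPy sketch (an illustrative toy, not encoder code: it assumes planar 4:2:2 input with an even frame height, whereas YUY2 is actually a packed format that would have to be unpacked first):

    import numpy as np

    def yuv422_to_yuv420(y, u, v):
        # y: full-resolution luma plane (H x W)
        # u, v: 4:2:2 chroma planes (H x W/2) -- half horizontal resolution
        # 4:2:0 halves the vertical chroma resolution as well, so average
        # each pair of vertically adjacent chroma rows.
        u420 = ((u[0::2].astype(np.uint16) + u[1::2]) // 2).astype(np.uint8)
        v420 = ((v[0::2].astype(np.uint16) + v[1::2]) // 2).astype(np.uint8)
        return y, u420, v420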

Capturing Optimal Resolutions:
If you capture 320×240 to an AVI file, the capture card throws away one of the fields, which effectively deinterlaces the video. If your target audience plays the video at 320×240, this usually produces acceptable results. However, to ensure the highest quality, you should capture both fields, so that you can use Microsoft Windows Media Encoder to deinterlace the video or apply the inverse telecine feature. Deinterlacing and inverse telecine require both fields of a frame to be present in order to function properly. For this reason, it is recommended that you capture either at 320×480 or 640×480. After deinterlacing or the inverse telecine filter is applied in the encoder, output video encoded at 320×240 will have higher quality.

Friday, February 23, 2007

Encryption: we know we need it—so now what? Encrypting backed up data stored to tape or other mobile media


Anyone in IT who's read the headlines understands that encrypting data is moving from optional to obligatory, and anybody who's not thinking about it now should be. Stored data that can be moved off-site--sometimes referred to as data at rest--is the most vulnerable. Once data has been backed up, it has to be stored, and that job may be handed off to a third-party business that securely stores data off-site, such as Iron Mountain. Regardless of who handles long-term storage, this data may be stored for years. That's a long time for an organization's data to be left unattended, so this data needs to be encrypted.

The next step is to figure out how to evaluate available encryption solutions. A few criteria are pretty easily identified:
* Robust Security: It makes sense to implement the strongest encryption method from the array of available options. The strength of encryption depends on the algorithm used and the key length, and AES-256 encryption is the gold standard. The Advanced Encryption Standard (AES) is approved by the National Institute of Standards and Technology (NIST) for use in protecting federal information. AES can be implemented with any of three key sizes: 128-bit, 192-bit, and 256-bit. The longer the key, the harder it is to break the encryption; AES with a 256-bit key makes brute-force attacks computationally infeasible. (A short code sketch follows this list.)

* Key Management: The hard part about encrypting data is not how to encrypt it--it's how to manage the keys. If you don't keep the keys safe, your encryption plan is ineffective. If you keep the keys too far out of reach, you can't decrypt your data, which renders your encryption plan impractical. So a complete key management application--one that helps you manage and protect data and keys while safely matching encrypted data with the right key--should be a requirement for any encryption system you're considering.
* Price: Most data centers have a limited budget and a maximized workload, so the selected encryption method needs to be affordable and simple to implement and manage, which limits administrative overhead and expense.
In addition, evaluate performance and any unique factors that a specific encryption solution might offer. With this framework, you can assess available encryption solutions.
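For illustration, here is a minimal Python sketch of AES-256 encryption using the third-party cryptography package (an assumption-laden toy, not a backup product: the AES-GCM mode is chosen here because it also authenticates the data, and a real system would add key management, nonce bookkeeping, and streaming I/O):

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=256)   # the 256-bit key -- protecting it is the hard part
    aesgcm = AESGCM(key)
    nonce = os.urandom(12)                      # 96-bit nonce, must be unique per message

    ciphertext = aesgcm.encrypt(nonce, b"backup block goes here", None)
    plaintext = aesgcm.decrypt(nonce, ciphertext, None)
    assert plaintext == b"backup block goes here"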
What are the Choices?


AES encryption for stored data can be implemented at several locations in the data path as data moves from primary storage to a stored state:
* Just before data is sent to the server running backup software--for example, by a network encryption appliance.
* While the data is being processed by the backup software.
* After the data is formatted by the backup software, a network encryption appliance can encrypt data before it's sent to the library.
* The library, where the data is written to tape or other portable media. (Tape drives do not yet provide encryption.)


Network Encryption Appliances:
Some sites encrypt data across the entire network using network encryption appliances, such as those from Decru and NeoScale. These appliances can also be dedicated to encrypting stored data. Appliances can encrypt data before or right after data is processed by the backup software.
Advantages
* Robust Security: AES-256 encryption. This option provides encryption across the widest area, since it can also handle encrypting network traffic.
* Key Management: Supplies key management along with the hardware-based encryption.
* Performance: Uses fast hardware-based encryption that offloads computation-intensive encryption processing from the backup server, so server performance isn't affected; it also provides compression.
* Unique Factors: Certified at various levels against the Federal Information Processing Standards (FIPS) that specify data security--specifically, FIPS 140-2.
Disadvantages
* Price: Can be costly. This may be warranted for high-security sites, but for many, cost may be a barrier. They are also very costly to scale, and may be overkill given the incremental data growth that data centers typically manage.
* Ease of Implementation and Management: An appliance introduces another set of interfaces, limitations, and management complexities, and another support/service-level agreement--all added to the management responsibilities you already carry for backup software and hardware. Cost is also increased by the appliance's use of data center space, which is particularly expensive in metropolitan areas.
* Possible security issue: If the appliance is used before the data is processed by the backup application, check how file data is stored. Some backup software applications leave file data in cleartext (unencrypted), which can leave file names exposed--a possible risk.
Encryption through Backup Software
Backup software can also encrypt data as it's backed up.

Advantages:
* Price: It's easy to scale software by simply purchasing additional licenses. Support for the encryption module may cost extra, but no additional vendor contract is necessary.
* Ease of Implementation and Management: You already have backup software and already know how to use it, and you can keep using it when you encrypt data. An encryption-specific module may be added, but you won't have to learn new interfaces.

Optical storage remains a top choice for compliance


The business requirements for a record archive have evolved very rapidly over the last few years. Major financial scandals and recent incidents involving large-scale data loss have turned the spotlight on the management of digital archives. An increased awareness of the value and liability of archive records has resulted in both industry regulation and internal operational risk management.
The requirement for tighter integration of archive policies within an IT infrastructure is creating a demand for more flexible strategies that can accommodate the new regulatory and risk management burden. This need for flexibility is particularly important in the choice of physical storage media since it will, in large part, determine the success of implemented policies. Currently, the two preferred advanced-technology choices for archiving are UDO (Ultra Density Optical) with "True" WORM (Write Once Read Many) storage and disk-based technology such as WORM storage with a CAS (Content Addressable Storage) interface. Each performs the same end function, but the method, cost and effectiveness are not equal. This can be illustrated by examining common archive objectives that are tightly linked to storage media attributes, including record authenticity, record destruction, and TCO (Total Cost of Ownership).


UDO (Ultra Density Optical) offers "True" Write Once technology implemented at the physical media level. The recording surface of True Write Once UDO media allows files to be written, but the media itself cannot be physically erased or modified. This technology is significantly different from magnetic disk and tape emulation, since the Write Once properties of UDO are inherent to the recording surface of the media and are not a function of software or firmware controls.

If archiving on a typical RAID system, a simple delete operation does not remove the data from the disk. The only way to physically destroy records is by repeatedly overwriting the targeted sectors with a patterned sequence to ensure no residual trace of the document remains on the media.
The US Department of Defense has an often-quoted specification for data shredding on magnetic disk media. The specification (DoD 5220.22-M) has been implemented in some specialized CAS interface products in the context of a record retention policy; depending on the source of the recommendation, targeted sectors should be overwritten between 3 and 35 times.
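As a purely illustrative Python sketch of the multi-pass overwriting idea (with a large caveat: journaling file systems, disk caches, and SSD wear leveling can all retain copies, so real shredding tools work below the file-system layer):

    import os

    PATTERNS = [b"\x00", b"\xff", b"\x55"]      # three example passes; stricter policies call for more

    def overwrite_file(path, patterns=PATTERNS):
        size = os.path.getsize(path)
        with open(path, "r+b") as f:
            for pattern in patterns:
                f.seek(0)
                f.write(pattern * size)
                f.flush()
                os.fsync(f.fileno())            # force each pass out of the OS cache and onto the disk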
By contrast, UDO offers a Compliant Write Once media format designed specifically for data disposition requirements. Compliant Write Once UDO operates like standard WORM media, but it can physically destroy targeted files through a special "shred" operation. This is a one-pass function that provides full verification, and unlike the erase pass on magnetic disks, the shred procedure on UDO media leaves no residual traces of previously written files. Compliant Write Once UDO media enables record-level retention management with an extremely high standard for physical record destruction.

Application and desktop virtualization


Despite rumors to the contrary, virtualization is not just for the datacenter. From the most complex workstation applications to the simplest DLLs, virtualization is leaving an indelible mark on client computing.
A good example of this is application virtualization, a label applied to products that insulate running programs from the underlying desktop. The idea behind application virtualization is to eliminate many of the support-draining configuration problems that plague conventional desktop implementations. These products virtualize the interaction between a given program and supporting OS resources, like the file system and, in the case of Windows, the system registry database. All these products isolate applications from the OS image, but the approaches are quite varied.
Application acrobatics
At one end of the spectrum are products like Altiris Software Virtualization Solution (SVS). Tools like SVS employ what might be called the “brute force” method: A simple filter driver is installed in the Windows file system code stack to intercept and redirect I/O calls from SVS-managed applications. When enabled in its respective “layer,” an SVS-managed application appears to integrate seamlessly with the OS. In reality, every aspect of the application’s OS interaction, from loading a DLL to accessing a registry key, is being redirected on the fly to a local cache file managed by SVS.
The advantage to this approach is that it fully isolates the OS from the application: Any changes made by the application – to the Registry, to its own files, to Windows – are in fact occurring solely within the SVS-managed cache file. Since no real changes are occurring, the underlying OS image remains pristine and the application can be “disabled” by simply clicking a button or by remotely disabling it from a supported management console. The downside to this approach is that it has trouble managing multiple versions of the same application; for example, Microsoft Office can sometimes trip up SVS by invoking the wrong version of a component when multiple versions are installed in parallel layers.
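To make the redirection idea concrete, here is a toy Python sketch (not how SVS is implemented, which is a kernel-mode filter driver; the OVERLAY location and helper names are invented for illustration): writes land in a private overlay directory so the real files are never touched, and reads prefer the overlay copy when one exists.

    import os

    OVERLAY = "overlay/myapp"   # hypothetical per-application cache location

    def _overlay_path(path):
        return os.path.join(OVERLAY, path.lstrip("/\\"))

    def write(path, data):
        target = _overlay_path(path)                 # the real file is never modified
        os.makedirs(os.path.dirname(target), exist_ok=True)
        with open(target, "wb") as f:
            f.write(data)

    def read(path):
        target = _overlay_path(path)
        source = target if os.path.exists(target) else path   # prefer the overlay copy
        with open(source, "rb") as f:
            return f.read()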
At the other extreme you have solutions like Softricity’s SoftGrid (recently acquired by Microsoft and soon to be integrated with the base Windows Server platform). SoftGrid provides a complete virtualization environment: Applications are streamed to the client from a server share and then executed within a customized “sandbox” that completely isolates the code from the OS. The advantage to this approach is that it avoids many of the multiversion issues that plague SVS. However, the trade-off is a more complicated deployment process that requires administrators to create a custom installation image to optimize the code base for streaming.
Of course, no market segment is complete without an interloper to shake things up. Thinstall combines the simplicity of SVS with the fully padded box approach of SoftGrid. By embedding both the virtualized environment and the application image into a single executable file, Thinstall eliminates the need for supporting infrastructure: Just copy or stream the file to the client and execute. No agent is required and the image can be deployed using virtually any traditional management suite, including Active Directory and Microsoft Systems Management Server. The downside is the need to customize the application using Thinstall’s Virtualization Suite toolset.
Classic virtual machines
In some client situations, a more comprehensive virtualization solution is required, such as hosting a legacy application on a new operating system. In that case, it may be best to isolate an application within a complete, virtualized OS environment — the classic “virtual machine” approach. This enables you to run an application within the OS image of your choice while still supporting migration to, and integration with, newer or otherwise incompatible OS platforms.
VMware and Microsoft dominate the classic VM market, with VMware the more visible of the two. Efforts like the VDI (Virtual Desktop Initiative), a consortium of vendors promoting virtualization as a desktop and application management solution, are being driven primarily by VMware.
VMware has also been quick to embrace new CPU and hardware technologies, such as 64-bit processing and expanded memory for next-generation applications. VMware exclusives, such as the ability to take snapshots of a VM’s running state and “roll back” to a saved image, have earned affection from the developer community. But in the end, VMware’s willingness to expose its underlying virtualization technology to the masses may pay the biggest dividends.

Skype petitions FCC for open cellular access


Skype petitioned the Federal Communications Commission earlier this week to force U.S. mobile operators to loosen controls on what kinds of hardware and software can be connected to their networks.
In a document dated February 20, Skype asked the FCC to apply to the wireless industry what is known as the "Carterfone" rules, which would allow consumers to use devices and software of their choice on cell phone networks.
Skype's motivations are clear. The company has created software that allows people to make free phone calls across the Internet. And now it wants users who access the Internet via a mobile device to be able to use its software and services, too.

"We want to allow our users to use the Skype software where ever they are," said Christopher Libertelli, senior director of government and regulatory affairs for Skype. "And we want to make sure the policy is set in the right direction so that when Skype users want to use it on mobile devices, they'll be able to."


The "
Carterfone" rules, which were enacted in 1968 during the old AT&T's monopoly of the phone industry, allow consumers to hook any device up to the phone network, so long as it did not harm the network. Prior to these rules, AT&T provided all telephones and devices connected to the telephone network, and it routinely sued companies that sold unauthorized products that could attach to the network.

The rules helped spur new innovations, such as the fax machine and Internet modem. In more recent times, the principle has been extended to other communication networks, such as cable modem and DSL. This has paved the way for companies such as Linksys to sell wireless routers.

But the principle has not been applied to cellular networks. As a result, the market has evolved into one that is heavily controlled by carriers. They dictate which phones are used on their networks, what content users can access, and which applications can run on phones. Some have even included specific terms in their service contracts that prevent customers from downloading and using software from Skype on their networks.
Tim Wu, a law professor at Columbia University, published a report earlier this month arguing that the "Carterfone" rules should also apply to the cellular industry, because otherwise carriers exert too much influence over the design of the devices and the applications that run on them.

"They have used (their) power to force equipment developers to omit or cripple many consumer-friendly features," he writes. "Carriers have also forced manufacturers to include technologies, like 'walled garden' Internet access, that neither equipment developers nor consumers want. Finally, through under-disclosed 'phone-locking', the U.S. carriers disable the ability of phones to work on more than one network."
Not surprisingly, the cell phone industry's trade organization, CTIA, doesn't agree with Skype or Wu that regulations are needed.

"Skype's self-interested filing contains glaring legal flaws and a complete disregard for the vast consumer benefits provided by the competitive marketplace," Steve Largent, chief executive of the CTIA, said in a statement. "The call for imposing monopoly era Carterfone rules to today's vibrant market is unmistakably the wrong number."

"At the end of the day, bits are bits in the Internet," said Dave Passmore, a research director at the Burton Group. "So Verizon or anyone else who wants to tell someone they can't download a VoIP client onto their phone from the Internet is going to have a very hard time enforcing it."

Thursday, February 22, 2007

AQUA: An Amphibious Autonomous Robot


AQUA, an amphibious robot that swims via the motion of its legs rather than using thrusters and control surfaces for propulsion, can walk along the shore, swim along the surface in open water, or walk on the bottom of the ocean. The vehicle uses a variety of sensors to estimate its position with respect to local visual features and provide a global frame of reference.

The aquatic environment is almost ideal for autonomous robot development. First, it provides a range of real tasks for autonomous systems to perform, including ongoing inspection of reef damage and renewal, tasks in the oil and gas industry, and aquaculture. Second, operating in the water requires robust solutions to mobility, sensing, navigation, and communication.
A common theme of many industrial aquatic tasks is site acquisition and scene reinspection (SASR). Figure 1 shows AQUA performing a typical SASR task, in which it walks out into the water under operator control and is directed to a particular location where it will make sensor measurements. Once near the site, the robot achieves an appropriate pose from which to undertake extensive sensor readings. After making the measurements, the robot returns home autonomously. Later, the robot autonomously returns to the site to collect additional data.

Wednesday, February 21, 2007

Algorithm helps computers beat humans at Go




Computers can beat some of the world's top chess players, but the most powerful machines have failed at the popular Asian board game Go, in which human intuition has so far proven key. Two Hungarian scientists have come up with an algorithm that helps computers pick the right move in Go, a game played by millions around the world in which players capture territory by taking turns placing black and white stones on a board.
"We are not far from reaching the level of a professional Go player," Levente Kocsis of the Hungarian Academy of Sciences computing lab Sztaki said.
The 19×19 grid board that top players use is still hard for a machine to master, but the new algorithm is promising because it makes better use of the growing power of computers than earlier Go software did.
"Programs using this method immediately improve if you use two processors instead of one, say, which was not typical for earlier programs," Kocsis said.
Whereas a chess program can evaluate a scenario by assigning numerical values to pieces--9 to the queen and 1 to a pawn, for example--and to the tactical worth of their position, that technique is not valid for Go.
In Go, all stones have an identical value, and scenarios are more complex, so the computer has to consider all potential moves through the end of the game and simulate the outcome of each alternative move.
Even the most powerful computers have failed at that task, but Kocsis and colleague Csaba Szepesvari have found a way to help computers focus on the most promising moves, using an analogy with slot machines in a casino.
Punters will find that certain one-armed bandits in a casino appear to pay more on average than others, but an intelligent player should also try machines that have so far paid less, in case they are hiding a jackpot, Kocsis said.
The key is to find the balance between the two sorts of machine.
Go software using a similar method, called UCT (Upper Confidence bounds applied to Trees), does not have to scan all possible outcomes of a game and can quickly find the best mix of scenarios to check.


"This bandit algorithm has proven advantages," Kocsis said.
The possible outcomes of a game are like branches of a tree, and earlier Go programs, unable to scan all branches, picked some at random and tried to find the best move from that sample.
The UCT method (PDF) helps a computer decide which routes are most worth investigating. Programs based on it have consistently won games against most other machines, according to Kocsis.
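For readers curious what the bandit rule inside UCT looks like, here is a minimal Python sketch of the UCB1 selection step (an illustrative toy, not the researchers' code: a real Go engine wraps this in full Monte Carlo tree search with random playouts):

    import math

    def ucb1_select(children, total_visits, c=1.4):
        # children: list of dicts with "visits" and "wins" counts for each candidate move
        def score(child):
            if child["visits"] == 0:
                return float("inf")                  # always try an untested move once
            exploit = child["wins"] / child["visits"]                           # how well it has paid so far
            explore = c * math.sqrt(math.log(total_visits) / child["visits"])   # bonus for under-explored moves
            return exploit + explore
        return max(children, key=score)

    moves = [{"visits": 10, "wins": 6}, {"visits": 3, "wins": 2}, {"visits": 0, "wins": 0}]
    best = ucb1_select(moves, total_visits=13)       # the untested move is tried first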