Five Technologies You Need To Know About

Keep your eye on these innovative hardware and software solutions with the potential to change all the rules.

May 3, 2006



Prediction is very difficult, especially about the future.
-- Niels Bohr, physicist

If hindsight is 20/20, the ability to look forward in time is utterly myopic. It's dead easy to forecast that technology will alter the way we work in the future. It's nearly impossible to predict which, why, when, and how specific technologies -- both hardware and software -- will bear fruit to make this happen.

But it's no fun to sit on the sidelines and watch the future unfold without taking a stab at being a digital Nostradamus. So we're taking our best guess as to which emerging technologies will have the most impact on computing in the very near future.

Over the following pages, you'll read about five specific advances in hardware, software, and Internet tech that should be on every technophile's, Web strategist's, and CTO's radar. These technologies promise such increased efficiency, performance, and functionality that ignoring them would be foolhardy. At the very least, you need to know about them now so your face doesn't go slack when your co-workers bring them up at lunch.

The items on our list aren't a pack of pie-in-the-sky fantasies, either. We expect to see these technologies actively in play by the end of this year. In a few instances, they are already changing the rules of how we compute at both the consumer and corporate level.

Enough with the introductions. Let's get on with the list.

Five Technologies You Need To Know About

•  Introduction
•  Ajax: The new face of the Web
•  Core: Intel's next-gen processor architecture
•  Holographic storage: The ultimate data storage solution?
•  NAND drives: Flash memory hard disks
•  AMD-V and VT: The next step in virtualization


Ajax

What is it? Ajax is a methodology that fuses existing Web development technologies to produce highly functional and responsive Web sites and services.

Why is it worth watching? Two words: Web services. It is difficult to overstate Ajax's impact on both Web development and Web browsing. As developers have come to embrace it, our expectations of what a Web site can do are undergoing yet another profound shift. The ability to build Web sites that look and feel like desktop applications is one reason Ajax is such a big deal; faster online experiences are another.

When is it coming? It's already here; in fact, you've probably already used dozens of Ajax-enhanced Web sites.

Tell me more. As far as buzzwords go, Ajax takes the cake. Everyone is talking about it. Everyone is doing it, or thinking about doing it. Google is doing it. Microsoft is doing it. Yahoo is doing it. Chances are, most of your favorite Web sites are doing it, even if you're not aware of it.

Short for Asynchronous JavaScript and XML, Ajax is not a technology per se. It's a philosophy. A style. A frame of mind, if you will. The functionality comes from combining two pieces: JavaScript's client-side scripting and XML's ability to deliver specific data efficiently and directly. Together, they let developers build Web pages with the responsiveness of desktop applications.

The key to developing in the Ajax style is efficiency in requesting and retrieving data. As reported on TechWeb, the idea is for sites to update their content in a background cache so that specific pieces of information are ready to display as needed. Rich micro-interactions (managed through the combination of XML and JavaScript) mean that the entire page doesn't have to reload -- only the portion with fresh data actually changes.

This stands in marked contrast to the standard, old-school method of refreshing entire pages to present the user with the few new pieces of data requested.
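To make the pattern concrete, here's a minimal sketch in TypeScript of what an Ajax-style partial update might look like. The endpoint URL, element ID, and polling interval are hypothetical placeholders, not taken from any particular site.

```typescript
// Minimal Ajax-style partial update: fetch fresh data in the background
// and swap it into one region of the page, without a full page reload.
// The endpoint URL and element ID below are hypothetical placeholders.
function refreshStockQuote(symbol: string): void {
  const xhr = new XMLHttpRequest();
  xhr.open("GET", `/quotes?symbol=${encodeURIComponent(symbol)}`, true); // async request
  xhr.onreadystatechange = () => {
    if (xhr.readyState === 4 && xhr.status === 200) {
      // Only the quote region changes; the rest of the page is untouched.
      const target = document.getElementById("quote-" + symbol);
      if (target) {
        target.textContent = xhr.responseText;
      }
    }
  };
  xhr.send();
}

// Poll every 15 seconds so fresh data is ready to display as needed.
setInterval(() => refreshStockQuote("ACME"), 15000);
```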

Google Maps is a classic example of an Ajax Web application. Response times are fast, and the site delivers an experience that looks and functions more like a "traditional" local application than a Web page.

Yahoo is also using Ajax in its Flickr photo service. And in the very near future, Microsoft will launch Windows Live (now in beta), an ambitious Web service that will let users run several Windows-based applications from a browser window.

The new world of Web services -- another twenty-dollar buzzword with actual real-world potential -- pretty much owes everything to Ajax developers.



Core

What is it? Core is Intel's highly anticipated upcoming microprocessor architecture.

Why is it worth watching? For Intel, Core is huge because the chipmaking giant is betting the proverbial farm on this new processor architecture. For the enterprise and consumer sectors, Core is worth watching because of the major advances it could deliver on the all-important performance-per-watt scale.

When is it coming? Any day now, we hope. Intel has it on the company roadmap for the vague "second half of 2006" time period, but industry insiders have indicated we'll see Core in early summer.

Tell me more. When Intel abandons a code name in favor of a real-world name -- as the company did in March at the Intel Developer Forum, when it announced that "Merom" processors would now be known as "Core" processors -- the event typically signals a pending release. We can now safely assume that sometime in the next few months, Intel will release the first processors in the Core line, and their impact will be felt in both the mobile and desktop markets.

Besides the name change, three hyphenated words distinguish Core (also known by the more technical moniker Next Generation Micro Architecture, or NGMA for short) from the score of Pentium processors Intel has released over the years: performance-per-watt. After a few years of getting its butt kicked by AMD's lightning-fast, thermally efficient chips, the company is finally firing back.

While it isn't yet clear how big an improvement in performance and power consumption NGMA will put forth, Intel and many analysts are impressed with the potential.

The first Core microarchitecture CPU releases -- code-named Conroe for desktops, Merom (now officially Core Duo, with models rumored to range from T5500 to T7600) for notebooks, and Woodcrest for servers -- will be built on a 65nm process. This allows a larger number of transistors to fit on a chip, which results in lower temperatures and faster performance.

Furthermore, these processors will have an efficient 14-stage pipeline, which circumvents the need for outrageous clock speeds, again resulting in reduced operating temperatures. (In contrast, the notoriously heat-intensive Pentium 4 Prescott had a 30-stage pipeline.) Core microarchitecture CPUs will feature Intel's Virtualization Technology (VT) and 64-bit extensions.

The key to Core processors is their ability to work together in dual-core and multi-core configurations. Ars Technica, which has provided some early, in-depth details on and analysis of the microarchitecture, says, "Core's performance will scale primarily with increases in the number of cores per die and with the addition of more cache, and secondarily with modest, periodic clock speed increases." This makes it clear that Core is being built with stackability in mind.
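To see why leaning on core count rather than clock speed pays off, here's a rough back-of-the-envelope sketch using Amdahl's law -- a general-purpose scaling formula, not anything Intel has published -- with an invented parallel fraction used purely for illustration.

```typescript
// Amdahl's law: speedup = 1 / ((1 - p) + p / n), where p is the parallel
// fraction of a workload and n is the number of cores. Illustrative only;
// the 0.8 parallel fraction below is a made-up example, not Intel data.
function amdahlSpeedup(parallelFraction: number, cores: number): number {
  return 1 / ((1 - parallelFraction) + parallelFraction / cores);
}

for (const cores of [1, 2, 4, 8]) {
  const speedup = amdahlSpeedup(0.8, cores);
  // At p = 0.8: 1.00x, ~1.67x, ~2.50x, ~3.33x -- the gains come from adding
  // cores, not from pushing clock speed (and heat) ever higher.
  console.log(`${cores} core(s): ${speedup.toFixed(2)}x`);
}
```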

The promise for the mobile market is clear; increased performance at reduced power consumption is always a win for mobile users. But the desktop market is also a key battleground. Desktop processors have become incredibly inefficient thermally. Higher temperatures mean decreased stability along with increased difficulty in raising performance. Being able to stack processors rather than ratchet up clock speeds will permit Intel to scale CPU performance with very few tradeoffs.

But with all this said, there are no certainties regarding Core's performance. Sure, it will be faster, and yes, it will likely be more efficient. But how well will it scale into the future? In the past 12 months, AMD has made a gigantic dent in Intel's desktop market share. In a few months, we'll know for sure whether Intel has hit upon the CPU microarchitecture that will allow the company to reclaim the dominance it once held over AMD.



Holographic Storage

What is it? Holographic storage is a technology that uses three-dimensional imaging to dramatically increase storage capacity on a disk. We're talking 515 gigabits of data per square inch -- nearly 10 times today's standard capacity.

Why is it so important? Think 300GB disks. In addition to promising massive storage capacities by breaking through the density limits of conventional data storage, holographic storage is also capable of producing considerably higher data transfer rates.

When is it coming? Toward the end of 2006.

Tell me more. It's possible that the much-hyped Blu-ray and HD-DVD optical disc technologies could be obsolete just months after launch, if InPhase Technologies is able to deliver on its promise of releasing the first commercial application of holographic storage later this year.

In March 2006, the company -- a 2001 spinoff of Lucent Technologies -- announced that it had produced a storage format called Tapestry that's capable of containing an astonishing 515 gigabits of data per square inch. InPhase also announced that, later this year, the company would release the industry's first real-world holographic drive and media, with a capacity of 300GB on a single disk and a blazing-fast 20MB-per-second transfer rate. Initial costs are estimated to be in the $8,000 to $10,000 price range.
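As a quick sanity check on those claims -- using only the figures InPhase quotes, and assuming decimal units -- here's the arithmetic for how long it would take to fill one disk at the quoted transfer rate.

```typescript
// Back-of-the-envelope check on InPhase's quoted figures.
// Assumes decimal units (1 GB = 1,000 MB), as drive makers typically count.
const diskCapacityGB = 300;   // first-generation Tapestry disk
const transferRateMBs = 20;   // quoted transfer rate in MB per second

const secondsToFill = (diskCapacityGB * 1000) / transferRateMBs;
const hoursToFill = secondsToFill / 3600;

// 300,000 MB / 20 MB/s = 15,000 s, or roughly 4.2 hours to write a full disk.
console.log(`Filling a ${diskCapacityGB}GB disk at ${transferRateMBs}MB/s takes ~${hoursToFill.toFixed(1)} hours`);
```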

Holographic storage goes beyond the conventional method of writing data on the surface of a medium by using a split laser beam to "stack" data as digital holograms throughout the full three-dimensional depth of a disk. (See InPhase's Technology Tour for an explanation of how it works.)

There are clearly no guarantees when it comes to emerging technologies, but the promise of such tremendous storage capacities at such high transfer rates is exciting. Data storage is one of the biggest challenges facing corporate IT departments today, and the need will only increase with time. What's more, this may be the precursor to even more dramatic developments: InPhase has indicated that the second generation of holographic storage will produce disks ranging from 800GB to 1.6 terabytes per disk.

It's likely that the cost of holographic storage will be prohibitive for all but the largest corporations and IT departments, and it's entirely possible that the technology will remain beyond the reach of ordinary consumers for years. But if InPhase -- currently the only technology company developing holographic storage -- strikes gold, competition and lower prices won't be far off. And with companies such as Turner Entertainment planning to roll out archival storage systems based on the technology, adoption already appears to be underway.



NAND Drives

What is it? NAND is a type of flash memory that will be used in solid-state hard drives.

Why is it worth watching? Imagine a solid-state drive that consumes a fraction of the power of a conventional laptop hard disk drive, weighs far less, and won't break when you drop it.

When is it coming? The end of 2006.

Tell me more. In the middle of March at the CeBIT trade show, Samsung displayed a hyper-thin Q30 notebook computer. In and of itself, the 2.5-pound, 0.7-inch-thick portable was remarkable, but the true stunner rested inside the machine: Samsung engineers had outfitted the Q30 with a 32GB solid-state disk (SSD).

This is the sort of technological achievement that feels like an inevitability, if a slightly sci-fi one. The advantages of solid-state drives are clear -- reduced power consumption, faster read/write times, and increased reliability -- so it stands to reason that as flash memory becomes more ubiquitous and affordable, storage drives without moving parts will become a practical reality.

So what is NAND exactly? It's one of two different types of flash memory; the other is NOR. NAND (which stands for Not And) memory excels at quickly reading large files of sequential data and has fast erase/write times. Because of this, NAND flash memory is typically used in ultra-portable MP3 players and the memory cards found inside digital cameras. NOR (Not Or), on the other hand, excels at reading small amounts of non-sequential data. Because of this, it works best in cell phones and other devices that use small amounts of non-linear data.

NAND's strength at reading large files and its ability to rapidly erase and write data make it ideally suited for solid-state storage. SSDs based on NAND will essentially function like gigantic versions of the flash memory found inside digital cameras. They won't be cheap at first. But in time -- possibly as soon as the end of this year -- they'll be in laptops.

Samsung has already announced the availability of its 32-GB Flash-SSD, designed to replace traditional notebook hard drives. According to the company, its SSD reads data three times faster and writes data 1.5 times faster than a comparably sized 1.8-inch hard drive -- but it weighs only half as much, makes no noise, and uses only 5 percent of the electricity needed to power a hard disk. It's likely we'll see high-end laptops -- including Samsung's own Q30 notebook -- featuring these drives (and perhaps others from competitors like SanDisk) by the end of the year.
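To put those ratios in perspective, here's a hypothetical sketch that applies Samsung's stated multipliers to an invented baseline drive; the baseline figures are for illustration only and are not published Samsung or hard-drive specs.

```typescript
// Apply Samsung's stated ratios (3x read, 1.5x write, 5% power) to a
// hypothetical 1.8-inch HDD baseline. The baseline numbers are invented
// purely for illustration; they are not published specs for any product.
const hddBaseline = { readMBs: 25, writeMBs: 20, powerWatts: 1.5 };

const ssdEstimate = {
  readMBs: hddBaseline.readMBs * 3,          // "reads data three times faster"
  writeMBs: hddBaseline.writeMBs * 1.5,      // "writes data 1.5 times faster"
  powerWatts: hddBaseline.powerWatts * 0.05, // "5 percent of the electricity"
};

console.log(ssdEstimate); // { readMBs: 75, writeMBs: 30, powerWatts: 0.075 }
```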



AMD-V and VT

What is it? Virtualization is a means of emulating multiple instances of an OS -- or multiple operating systems -- on a single computer or server. Until now, virtualization has mostly been implemented in software, but AMD and Intel are now building virtualization support into the microprocessor itself. AMD Virtualization (formerly code-named Pacifica), also known as AMD-V, is AMD's virtualization technology, while Virtualization Technology (formerly code-named Vanderpool), or VT, is Intel's. AMD-V and VT are not compatible with each other.

Why is it so important? For several years now, virtualization has been a hot topic in the enterprise world, allowing for more secure and open-ended programming and server environments. The technology also has some interesting potential at the consumer level. At either level, the hardware-assisted virtualization being offered by Intel and AMD has the potential to speed up and smooth out software-based virtualization, making the chipmakers' newest battlefield all the more intriguing.

When is it coming? VT- and AMD-V-enabled processors are available today.

Tell me more. At its essence, virtualization uses emulation to create a series of "virtual machines" that operate and appear as separate hardware devices, but are in fact running on a single system -- so a single PC can run multiple operating systems or multiple instances of the same operating system at the same time. Fans of science fiction or experimental physics may find an analogy in the concept of the multiverse, a hypothetical model that posits multiple instances of the same universe that together entail all of physical reality. Another analogy is that virtualization is a sort of multitasking, but at the OS level instead of the application level.

Using Microsoft's Virtual PC software, for instance, an IT department can run both Windows XP and Windows 98 on the same system in order to maintain compatibility with legacy applications while migrating a department to a new operating system.

Increased efficiency, security, and uptime are the key benefits delivered by virtualization. With it, you can run, say, five virtual servers at the same time on a single physical machine. If one of them crashes, there's built-in redundancy; the system simply reroutes incoming requests to one of the other virtual servers.
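In rough terms, that redundancy boils down to something like the following sketch; the server names and health check are hypothetical placeholders, not any actual virtualization product's API.

```typescript
// Illustrative failover sketch: if one virtual server stops responding,
// route the request to the next one in the pool. The host names and the
// isHealthy() check are hypothetical placeholders for a real health probe.
const virtualServers = ["vm-web-1", "vm-web-2", "vm-web-3", "vm-web-4", "vm-web-5"];

function isHealthy(server: string): boolean {
  // In practice this would be a heartbeat or health-check request.
  return server !== "vm-web-2"; // pretend vm-web-2 has crashed
}

function routeRequest(): string {
  for (const server of virtualServers) {
    if (isHealthy(server)) {
      return server; // first healthy virtual server takes the request
    }
  }
  throw new Error("no healthy virtual servers available");
}

console.log(routeRequest()); // "vm-web-1"
```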

Virtualization becomes even more fascinating and full of potential when you think past the enterprise level to consumer applications. In theory, OS emulation could be used to completely separate your work files and applications from your personal ones. Or it could be used in conjunction with a multi-core processor and multiple displays to let several users in the same household work on the same PC at once -- three members of a family could, say, play two different games and watch television simultaneously.

Until now, the key problem with emulation has been performance. Running virtual systems is useful, but in practice it can consume a great portion of a CPU's processing cycles, which slows down everything. By encoding the capability for virtualization at the hardware level, Intel and AMD are eliminating the CPU performance hit that desktops and enterprise servers suffer when emulating multiple operating systems or multiple instances of the same OS.

In late 2005, Intel began to incorporate VT into its line of desktop and Itanium processors, and in early 2006, AMD began incorporating AMD-V into its desktop, mobile, and enterprise-oriented processors. Both technologies are designed to smooth out the emulation process. Which one works best -- both now and in the future -- is what bears watching.
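If you want to check whether a machine you already own exposes these hardware hooks, one quick test on Linux is to look for the "vmx" (Intel VT) or "svm" (AMD-V) CPU flags in /proc/cpuinfo. Here's a small sketch, assuming a Node.js runtime on a Linux box.

```typescript
// Quick Linux-only check for hardware virtualization support: look for the
// "vmx" (Intel VT) or "svm" (AMD-V) CPU flags listed in /proc/cpuinfo.
// A sketch for illustration; it assumes a Node.js runtime on Linux.
import { readFileSync } from "fs";

const cpuinfo = readFileSync("/proc/cpuinfo", "utf8");

if (/\bvmx\b/.test(cpuinfo)) {
  console.log("Intel VT (vmx) supported by this CPU");
} else if (/\bsvm\b/.test(cpuinfo)) {
  console.log("AMD-V (svm) supported by this CPU");
} else {
  console.log("No hardware virtualization flags found");
}
```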

Early buzz is that AMD's approach with AMD-V is more advanced and forward-thinking than Intel's VT, particularly in the realm of memory management. However, until PC developers and Microsoft pull it all together and begin to leverage hardware-based virtualization at higher levels of functionality, we won't know which technology is superior. Recent news about VMware backing a newly formed Virtual Desktop Infrastructure Alliance that includes vendors such as Fujitsu, HP, Hitachi, IBM, NEC, Sun, and Wyse Technology is a welcome indicator that we'll see such functionality later this year. Until then, microprocessor-assisted virtualization is just CPU fanciness.


George Jones is a 14-year veteran of technology and gaming journalism. He's been an avid tech-head since the day he first screwed the plastic lid off his Commodore VIC-20.
