The ’00s 7: The decade’s most important tech advances

We select seven technologies that changed the game during the unofficial decade of 2000-2009.

In 2000, the information technology products touted as the best of the year included a 128 kilobits/sec wireless modem, a 3-megapixel digital camera (for $699) and a $2,700 laptop PC that was, at five pounds, among the lightest yet. People were still waiting for Bluetooth products, Intel Itanium chips and Mac OS X, all of which had been promised but not delivered. Google was just starting to get widespread notice, and Facebook was still four years away.

A lot can happen in 10 years.

Although we’re well aware that the decade still has another year to go – and we agree with those who point out that counting forward from the number one really isn’t that hard – we’re going to look back across the double-aughts anyway at the technologies that fed a 10-year span of rapid change.

In some cases, whether a given technology was good is a matter of opinion. But all of these have changed the way we live and work. It’s also worth noting that in the age of interoperability, mash-ups and cross-domain systems, many of these technologies work together, or at least depend on one another.

If you don't agree with our choices, sound off in the comments below.

1. GPS devices and applications

In 2000, the Defense Department ended its practice of intentionally degrading Global Positioning System signals for civilian use; until then, only military systems got the most accurate positioning information. Over the next 10 years, improvements in handheld technologies, mapping applications and Web-based interfaces combined with sharp location signals to put GPS devices in cars, smart phones, laptop PCs and people’s hands. You can get turn-by-turn driving directions, walking or hiking directions, or even plot a public-transportation route. You can also get personalized voice directions, whether from a generic computer voice or an imitation of a celebrity such as Clint Eastwood or Gollum from the “Lord of the Rings” movies. In a more practical vein, people and organizations are now combining GPS and mapping capabilities with government data made available through Data.gov and other sources to build new applications.
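As a small illustration of what developers build on those signals, the JavaScript sketch below uses the haversine formula, a common way to estimate the distance between two GPS fixes. The function name and sample coordinates are ours, for illustration only, not drawn from any particular product.

    // A quick sketch of the haversine formula, which mapping applications
    // commonly use to estimate the great-circle distance between two
    // latitude/longitude fixes. Names and coordinates here are illustrative.
    function haversineDistanceKm(lat1, lon1, lat2, lon2) {
      var toRad = function (deg) { return deg * Math.PI / 180; };
      var R = 6371; // mean Earth radius, in kilometers
      var dLat = toRad(lat2 - lat1);
      var dLon = toRad(lon2 - lon1);
      var a = Math.pow(Math.sin(dLat / 2), 2) +
              Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) *
              Math.pow(Math.sin(dLon / 2), 2);
      return 2 * R * Math.asin(Math.sqrt(a));
    }

    // Example: the Washington Monument to the Lincoln Memorial -- about 1.3 km.
    console.log(haversineDistanceKm(38.8895, -77.0353, 38.8893, -77.0502));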

2. Smart phones

For government users in particular, it all started with the BlackBerry, which itself started in 1999 as a two-way pager. Research In Motion soon began adding e-mail, texting and Web capabilities, and the “CrackBerry” craze took hold. Meanwhile, Palm’s popular Treo added a color screen and a backlit keyboard in 2002, and Windows CE started making inroads. Functionality steadily increased as phone-makers added browsers, cameras and keyboards to even casual-use models, and wireless providers delivered better cellular coverage and 3G capabilities. In 2007, Apple released the iPhone, which raised the bar considerably in terms of both user interface and functionality. With the addition of the App Store and the emergence of competitors such as the Droid on Verizon Wireless, users can now carry in their pockets computing devices capable of practically anything.

3. Open source

Ten years ago, the phrase “open source” tended to conjure images of a computing counterculture, people who operated outside the regular business model. Consider Linux, the open-source Unix-like operating system that was viewed as being only for hard-headed coders. (Even if they were actually following the true computing spirit of open information, that was the impression in a lot of quarters.) Today, open source is practically mainstream. Sun Microsystems went open source a few years ago, Microsoft has opened up some code, and both military and civilian agencies are promoting the use of open-source solutions. Open-source software in common use these days includes the Firefox browser, the OpenOffice.org suite and Google’s Chrome browser. And, of course, Linux is everywhere, especially on the back ends of Web-facing systems.

4. Web 2.0 technologies

It wasn’t that long ago that most Web pages were about as active as the pages in a book: They just sat there waiting for someone to read them. Developers began adding some functionality with Java, but not all browsers supported it. Over the past decade, Microsoft Silverlight and Adobe Flash moved the ball forward, but those products were still tied to a single vendor. In recent years, however, new technologies, particularly Asynchronous JavaScript and XML, have started to bring the dream of an open, interactive Web to reality. Ajax, actually a term covering a suite of technologies – HTML, Cascading Style Sheets, JavaScript, the Extensible Markup Language and XMLHttpRequest – allows JavaScript to send requests directly to a Web server and get a response without reloading the page. That ability, coupled with innovations such as the open-source Drupal content management system, has fueled the growth of Web 2.0 and cloud-based applications and made some of the most effective government Web sites possible. Web 2.0 technologies also have fed the development of social networking sites, from Facebook and YouTube to Twitter, which agencies have used to reach out to the masses.
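For the curious, here is a bare-bones JavaScript sketch of that pattern: ask the server for data, then update one element on the page without a full reload. The URL and element ID are invented for illustration; real applications typically wrap this plumbing in libraries.

    // A minimal example of the Ajax pattern described above.
    // The URL "/api/status" and element ID "status" are hypothetical.
    function refreshStatus() {
      var xhr = new XMLHttpRequest();
      xhr.open("GET", "/api/status", true); // true = asynchronous
      xhr.onreadystatechange = function () {
        // readyState 4 means the response is complete.
        if (xhr.readyState === 4 && xhr.status === 200) {
          // Update just this element; the rest of the page stays put.
          document.getElementById("status").innerHTML = xhr.responseText;
        }
      };
      xhr.send(null);
    }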

5. Flash memory

Sure, the Defense Department banned employees’ use of USB flash drives, but that just goes to show how easy, portable and capacious flash memory is. It costs less than other storage media, is non-volatile and more durable, and reads and writes quickly. These days, it’s standard in PDAs, smart phones and cameras, and it’s used in solid-state drives, which could eventually replace traditional hard drives in PCs, particularly laptops.

6. The spread of WiFi

The IEEE 802.11 standards for wireless communications date back far enough – the first WiFi network appeared at Carnegie Mellon University in Pittsburgh in 1994 – that they almost seem quaint. But the 2000s were really the 10-year stretch when WiFi became practically ubiquitous. WiFi networks became common in government, business and academic buildings. WiFi capability was built into everything from laptop PCs and smart phones to cameras and handheld gaming consoles. And although most attempts by cities to provide free municipal WiFi access failed, the spread of hot spots at coffee shops, airports and other locations made up for it. Meanwhile, the WiFi standards progressed from 802.11a through 802.11b and 802.11g and, finally, this year, officially to the 802.11n specification. The new standard greatly boosts the data transfer rates and range of WiFi devices, and wireless access point vendors were ready, having produced “Draft-N” devices for a couple of years. In the age of collaborative tools and social networking, talking about WiFi might seem a bit old-school, but that’s only because we take anytime-anywhere Internet access for granted. Ten years ago, we didn’t.

7. XML languages

Languages and schemas built on the Extensible Markup Language are the glue that holds together a lot of what happens on the Web and in connected systems. The XML 1.0 specification was officially defined by the World Wide Web Consortium in 1998, establishing rules for tagging data. It soon became essential to the workings of the Web as vendors adopted the standard and spread its use. Today, there are hundreds of XML languages, including the widely used, open-standard Extensible Business Reporting Language, the Geography Markup Language, Microsoft’s Office Open XML, the Medical Markup Language and even the Darwin Information Typing Architecture, used mostly in producing technical publications. Bottom line: If it involves storing, identifying, moving or sharing data, particularly via the Web, XML is in the mix. It’s become so common we might take it for granted, but the past 10 years in technology might have been very different without it.
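As a quick illustration of that glue at work, the JavaScript sketch below parses a tiny, made-up XML document in the browser; the tag names and values are invented, but the point is that any standard XML parser can pull values out of tagged data like this.

    // A small, hypothetical XML document and a standard browser parser.
    var xml = "<agency><office id='ocio'>" +
              "<name>Office of the CIO</name>" +
              "</office></agency>";
    var doc = new DOMParser().parseFromString(xml, "application/xml");

    // Walk the tagged data and pull out the value we want.
    var offices = doc.getElementsByTagName("office");
    for (var i = 0; i < offices.length; i++) {
      if (offices[i].getAttribute("id") === "ocio") {
        // Prints "Office of the CIO".
        console.log(offices[i].getElementsByTagName("name")[0].textContent);
      }
    }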