There will always be a gap when a new and significantly widespread technology is first introduced into society. Electricity, indoor plumbing, the United States Postal Service: each of these, once developed and recognized as highly functional and well worth having, took time to spread. These traditional technologies expanded slowly across the nation, but each could eventually be found in every American home and business.
The digital divide, however, began with the introduction of telecommunications, when technology first allowed people to connect almost instantly across great distances, with the telephone as the major starting point. The speed at which people could now communicate, and the very idea of far-reaching communication, was seen as a luxury, and the adoption of the telephone was slower than that of earlier traditional technologies.
Out of this grew the divide in internet access, the most common form of the digital divide in modern discussions. Because early internet access ran over telephone cables, the population without landlines or a telephone provider was already behind the curve. And as internet technologies improved, moving from telephone providers to dedicated internet service providers, the gap widened; each advance left those without access even further behind. In 1995 the Commerce Department published its first look at the digital divide, finding stark racial, economic, and geographic gaps between those who could get online and those who could not. Between 1991 and 1996, the number of personal computers in the United States skyrocketed from 300,000 to over ten million. With prominent news outlets fueling the discussion, the digital divide became a full-blown issue, finding its way into White House legislation and the work of government agencies such as the National Telecommunications and Information Administration (NTIA).
But by the early 2000s, the prices of computers and online access were falling, quieting calls to address the digital divide and shifting the terminology instead to “digital inclusion.” In 2002, CNET reported that more than half of all Americans were online, including the less well-off: between 1991 and 2001, internet use among households earning less than $15,000 a year grew at a 25% annual rate. This shift led to much less noise surrounding the issue, so much so that the government scaled back the NTIA's role and budget, ultimately eliminating its digital-divide programs.
However, developing technologies eventually reopened the gap that is the digital divide. With high-speed wired internet on one side and second-class wireless internet on the other, a hierarchy emerged in access to the internet. While millions are still offline completely, many internet users rely solely on second-class wi-fi across mobile devices, which denies them the ease and communication ability of an actual computer with a high-speed wired connection. Moreover, given persistent unemployment, it is still difficult for lower-income citizens to afford access, and with many schools unable to provide computers for students to use effectively, many individuals are still left not knowing how to use the technology at all.