The Internet has been hailed as one of the greatest technological advancements in human history. It brings people together and gives them unprecedented access to information from every part of the world. In the past decade the Internet, and its range of uses and applications, has grown rapidly, putting a large strain on the infrastructure behind it. The current infrastructure that supports the Internet will not be adequate to meet future demands, primarily because of the lack of end-user bandwidth, but also because there is no sweeping drive to improve or replace the underlying infrastructure. In the seventies and eighties, many networks were created for academic purposes, such as the ARPANET, which spawned what became known as the NSFNET under the direction of the National Science Foundation, and many other special-purpose networks large and small.
[...] Arguably, the largest problem facing the Internet infrastructure today is the low amount of available end-user bandwidth. End-user bandwidth is the bandwidth delivered to an individual, whether they access the Internet from a home PC, a large company network, a mobile device, or any other access point. There is plenty of bandwidth at the core of the Internet; however, this performance is generally not delivered to the end user. This low end-user bandwidth can be attributed to many factors. [...]
[...] Eventually the NSFNET was decommissioned because it could not compete with commercial networks, and the Internet as it is known today, unregulated and with little government funding, came to dominate the computer networking world. In terms of upgrading the Internet infrastructure, an unregulated Internet creates many problems. Without any central authority to mandate a sweeping change, there must be programs offering incentives for change, and creating such widespread incentives is itself a large problem. [...]
[...] Within the program, it is hoped that their work will someday translate directly into a faster and more robust Internet experience. As part of the Clean Slate program, Dr. McKeown and his team have created what they call the Rate Control Protocol, or RCP for short, which they claim, once widely deployed, will produce download speeds as much as ten times the current rate. The Rate Control Protocol is a congestion control algorithm that allocates limited bandwidth in an efficient manner. [...]
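To make the idea concrete, the following is a simplified sketch, in Python, of the kind of per-router rate update described in the published RCP literature: each router periodically computes a single rate to offer all flows, increasing it when the link has spare capacity and decreasing it to drain any standing queue. The function name, parameter names, and default gain values (alpha, beta) here are illustrative assumptions, not the authors' exact implementation.

```python
def rcp_rate_update(R_prev, C, y, q, T, d0, alpha=0.5, beta=0.25):
    """One RCP-style rate update at a router (simplified sketch).

    R_prev: rate offered to flows in the previous control interval (bits/s)
    C:      link capacity (bits/s)
    y:      measured input traffic rate over the last interval (bits/s)
    q:      persistent queue size (bits)
    T:      length of the control interval (s)
    d0:     estimate of the flows' average round-trip time (s)
    """
    # The spare-capacity term (C - y) pushes the offered rate up when the
    # link is underused; the queue term drains any standing queue over
    # roughly one round-trip time.
    feedback = alpha * (C - y) - beta * (q / d0)
    R = R_prev * (1.0 + (T / d0) * feedback / C)
    # Clamp to a sane range: never offer more than the link capacity,
    # and never a non-positive rate.
    return max(min(R, C), 1e-6 * C)
```

On an underutilized link (input rate below capacity, empty queue) this update raises the offered rate, letting new flows jump straight to a high rate instead of slowly probing for bandwidth as TCP does; on a congested link with a growing queue it lowers the rate, which is the mechanism behind the claimed reduction in download times.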
[...] The future of the Internet, however, is not all doom and gloom. Many steps are being taken all over the world to ensure that connectivity, speed, and reliability remain a non-issue. In February, Japan launched a super-high-speed satellite to supplement the country's current wired Internet infrastructure. This satellite is capable of providing Internet access to anyone who can afford a small satellite dish, and is reported to be able to provide speeds of up to 1.2 gigabits per second, much faster than DSL or cable access [10]. [...]
[...] At the turn of the century, almost fifty percent of American adults were using the Internet, while in December of 2007 it was found that almost seventy-five percent of American adults were using it. These dramatic increases in Internet usage closely track world population growth, and there is reason to believe that as the population increases, even more people will make use of the Internet. This growth will only increase the strain placed on the infrastructure providing them access. [...]