Which important Internet developments can currently be observed?
1. The number of people using the Internet is growing rapidly: in the early years of the Internet (the eighties and the very beginning of the nineties) most of its users were researchers and (American) public servants. These users were highly educated, were familiar with computers and/or networks, and knew how to use the various Internet services.
However, most of the users that come onto the Internet today are computer novices: they do not necessarily have a very high level of education, and are only partially familiar with the possibilities and techniques of networks in general and of the Internet and its services in particular;
2. The number of parties offering services and information on the Internet has grown rapidly: an increasing number of companies, but also other parties such as the government, are starting to offer services on the Internet (usually through the World Wide Web). The amount of money invested in 'Internet presence' and the like has been increasing since 1993 (when businesses and the media started to take notice of the Internet). To get an idea of just how rapidly the number of hosts [1] on the Internet is growing: in January 1996 the number of hosts had doubled compared to January 1995, to a staggering total of over 9 million. See [ZAKK96] for further and more detailed information;
3. The growth in the number of people using the Internet is outrunning the increase in available bandwidth: although large investments are being made in faster connections (for instance by replacing coaxial or copper wires with optical fibre) and in more powerful backbones [2], the demand for bandwidth is outrunning the supply by miles. Users, especially those who have been working on the Internet since the early days, are complaining about the overcrowding of the Internet, which leads to moments when it is nearly impossible to connect to servers or when transferring data takes ages. Internet users will have to live with this 'inconvenience', as it seems most unlikely that the growth of bandwidth will catch up with user growth any time soon;
4. Since 1995 the World Wide Web has been the most popular Internet service: until 1995, e-mail was the most used service on the Internet. However, because it is user-friendly, easy to use, and looks "cool" and attractive, the World Wide Web has taken over first place (in [ZAKK96], the WWW is declared one of the two technologies of 1995 [3]). Moreover, the WWW can serve as a sort of "umbrella" placed over other Internet services such as FTP or Gopher. Interfacing with a software archive through the WWW is much easier than using FTP itself: the user can usually do most (if not all) of the work with only a mouse, and does not need to know the various commands to move around the archive and download (i.e. get) software from it. The same goes for most of the other Internet services. [4]
Through the World Wide Web, users gain access to seemingly endless amounts of information and services. This is one of the most important reasons why (big) companies are starting to offer services and information on the WWW: when interesting information is combined cleverly with corporate (commercial) information, a company can gain massive exposure to users (all of whom may very well be potential customers) and collect all sorts of information about them (for instance through feedback given by the users themselves);
5. The emerging technologies of 1995 are mobile code (such as JAVA), virtual environments (VRML) and collaborative tools.

What influence do these developments have on agent technology and/or how are they linked to it?
One of the most remarkable developments is the high popularity of the World Wide Web. This popularity seems to indicate users' need for a single, user-friendly interface that hides most (or even all) of the different techniques (actually: services) that are needed to perform certain tasks on the Internet:

"The Web appears to provide what PC owners have always wanted: the capability to point, click, and get what they want no matter where it is. Whereas earlier manifestations of the information revolution bypassed many people who were uncomfortable with computing technology, it appears that the Web is now attracting a large cross section of people, making the universality of information infrastructure a more realistic prospect. If the Web is a first wave (or a second, if the Internet alone is a first), it is likely that further advances in utility and application will follow."
from [NRC94]

Developers of browser software are jumping onto this trend by creating increasingly versatile software packages. For instance, the newest version of Netscape - the most popular browser at this moment - can be used as a WWW browser, but also as a newsreader (for using Usenet) and a mail program (to send and receive e-mail). In fact, the booming popularity of the WWW is largely due to the versatile browsers that have been written for it.
Agents can offer this functionality as well. Better still: they can improve on it, with such advantages as greater software and hardware independence, extended functionality and more flexibility. Moreover, they can easily be combined with open standards (such as the three-layer model).
The World Wide Web may very well be considered the first step, or a stepping-stone, towards using more sophisticated technologies (e.g. intelligent software agents) and developing open standards for the Internet.

A growing problem on the Internet at this moment is the availability of bandwidth. A salient detail in this matter is that, currently, agents are partly the cause of it. A specific class of agents - information-gathering agents called worms and spiders, which are used to gather information about the contents of the Internet for use in search engines - is consuming quite a lot of bandwidth with its activities. The major reason for this is that every individual search engine has a whole fleet of such agents gathering information for it alone. The gathered information is not shared with other search engines, which wastes considerable amounts of bandwidth. [5]
However, as agent technology evolves this will change. Agents can then be brought into action to help reduce the waste of bandwidth [6]. This reduction can be achieved by such things as:
- Executing tasks, such as searches, locally (i.e. on the remote service) as much as possible. The agent then only sends the result of a search over the Internet to its user;
- Using the results of, and experience gained from, previously performed tasks to make future executions of the same task more efficient, or even unnecessary. Serious attempts are being made to let agents share gained experience and useful information with others; many user queries can then be fulfilled without the need to consult (i.e. use) remote services such as search engines;
- Using the "intelligence" of agents to perform tasks outside peak hours, and to spread the load on the Internet more evenly. Agents are also well suited to pinpointing at which hours of the day there is (too) much activity on the Internet, especially since this varies between the days of the week as well.
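The bandwidth-saving measures above - reusing earlier results and deferring non-urgent work to off-peak hours - can be sketched in a few lines of code. The sketch below is purely illustrative and not taken from any actual agent system; the class name, the `fetch` function and the chosen peak hours are all hypothetical assumptions.

```python
import time

class CachingAgent:
    """Toy sketch of an information-gathering agent that saves bandwidth
    by reusing earlier results and deferring work to off-peak hours.
    All names, and the remote fetch function, are hypothetical."""

    def __init__(self, fetch, peak_hours=range(9, 18)):
        self.fetch = fetch          # remote query function (hypothetical)
        self.cache = {}             # earlier results, reused for repeat queries
        self.peak_hours = peak_hours
        self.deferred = []          # non-urgent tasks, to run outside peak hours

    def search(self, query):
        # Reuse an earlier result instead of consulting the remote service again.
        if query in self.cache:
            return self.cache[query]
        result = self.fetch(query)  # only the *result* crosses the network
        self.cache[query] = result
        return result

    def schedule(self, query, now_hour=None):
        # Defer non-urgent queries that arrive during peak hours.
        hour = time.localtime().tm_hour if now_hour is None else now_hour
        if hour in self.peak_hours:
            self.deferred.append(query)
            return None
        return self.search(query)

# Usage: two identical queries cost only one remote call.
calls = []
agent = CachingAgent(fetch=lambda q: calls.append(q) or f"results for {q}")
agent.search("agents")
agent.search("agents")
print(len(calls))  # only one remote fetch was needed
```

Note that the cache here belongs to a single agent; the sharing of experience between agents described above would require the cache (or parts of it) to be exchanged between agents, which is outside the scope of this sketch.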

More on this subject will follow in the next chapter.

[1] A host is a computer which offers information and/or Internet services, such as an FTP archive or WWW pages.
[2] Backbones are large-capacity circuits at the heart of a network (in this case the Internet), carrying aggregated traffic over (relatively) long distances.
[3] Sun's JAVA technology was the other one.
[4] It should be noted that the user-friendliness is strongly dependent on the program that is used to navigate the Internet: the so-called browser. The functionality of the various browsers can vary considerably. However, most WWW users (about 80% at the beginning of 1996) use the popular Netscape browser, which offers all of the functionality described above.
[5] See section 1.2.2 and 4.3.1.
[6] Agents will help reduce the waste of bandwidth: they will not decrease the need for bandwidth.


"Intelligent Software Agents on the Internet" - by Björn Hermans