"The information superhighway directly connects millions of people, each both a consumer of information and a potential provider. If their exchanges are to be efficient, yet protected on matters of privacy, sophisticated mediators will be required. Electronic brokers can play this important role by organizing markets that promote the efficient production and consumption of information."
from [RESN95]


Although the Internet provides access to huge amounts of information, its information sources are currently too diverse and too complex for most users to exploit to their full extent. "Currently, the World Wide Web (WWW) is the most successful effort in attempting to knit all these different information resources into a cohesive whole that can be interfaced through special documents (called Web pages or hyper/HTML documents). The activity best-supported by this structure is (human) browsing through these resources by following references (so-called hyper links) in the documents." [1]  However, as is pointed out in [DAIG95a], "the WWW & the Internet do not adequately address more abstract activities such as information management, information representation, or other processing of (raw) information".
In order to support these activities with increasingly complex information resources (such as multi-media objects, structured documents, and specialised databases), the next generation of network services infrastructure will have to be interoperable at a higher level of information activity abstraction.
The most obvious implication concerns the development of information servers and indexes that can interact with one another, or that present a uniform face to the viewing public (e.g., through the World Wide Web). However, an information activity is composed of both information resources and information needs. It is therefore not enough to make resources more sophisticated and interoperable; it must also become possible to specify more complex, independent client-side information processing tasks [2].
In [DAIG95b] an experimental architecture is described that addresses both of these needs. In this architecture the information search process is divided into three layers: one layer for the client side of information (information searchers), one for the supply or server side of information (information providers), and one layer in between that connects the two in the best possible way(s) (the middle layer [3]).
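[DAIG95b] describes these three layers only in the abstract. As a rough illustration of the division of roles, the following minimal Python sketch shows a broker (middle layer) that matches a searcher's information need to registered providers; all class and method names here are invented for the illustration and are not taken from [DAIG95b].

```python
# Illustrative sketch of the three-layer search architecture: clients
# (information searchers), providers (information suppliers), and a
# broker forming the middle layer between them.
# All names below are hypothetical, not part of [DAIG95b].

from dataclasses import dataclass, field


@dataclass
class Provider:
    """Server side: an information source advertising the topics it covers."""
    name: str
    topics: set[str] = field(default_factory=set)

    def answer(self, query: str) -> str:
        # A real provider would search its own collection; stubbed here.
        return f"{self.name}: results for '{query}'"


class Broker:
    """Middle layer: matches a client's information need to suitable
    providers, so the client never has to know which sources exist."""

    def __init__(self) -> None:
        self._providers: list[Provider] = []

    def register(self, provider: Provider) -> None:
        self._providers.append(provider)

    def search(self, query: str, topic: str) -> list[str]:
        # Route the request only to providers that cover the topic.
        return [p.answer(query) for p in self._providers if topic in p.topics]


# Client side: an information searcher expresses a need and leaves the
# choice of providers to the broker.
broker = Broker()
broker.register(Provider("LibraryDB", {"literature", "history"}))
broker.register(Provider("NewsWire", {"news", "finance"}))
print(broker.search("software agents", topic="news"))
```

The point of the sketch is the indirection: the searcher states *what* it needs (a query and a topic), while the decision of *which* sources to consult, and how, is delegated entirely to the middle layer.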
Leslie Daigle is not alone in her ideas: several other parties are researching this concept, or concepts very similar to it.[4]  The fact is that more and more people are beginning to realise that the current structure of the Internet, which is more or less divided into two layers or parties (users and suppliers), is increasingly failing to be satisfactory.

[1] Quote taken from [DAIG95a].
[2] Note that this client may be a human user, or another software program.
[3] Other names used for this layer include information intermediaries, information brokers, and (intelligent) middleware. Throughout this thesis these terms will be used interchangeably.
[4] For instance, IBM is doing research into this subject in their InfoMarket project.


"Intelligent Software Agents on the Internet" - by Björn Hermans