3.3.4 Issues related to (using) Brokers & some concluding remarks

Brokering and intermediary services can be offered by both human and electronic intermediaries (e.g. software agents). Agent enthusiasts like to picture a world where human intermediaries are no longer needed, as agents will be able to do all the work for you. Those of a more technologically pessimistic disposition, on the other hand, think that humans will remain in the driver's seat when it comes to offering, seeking, and brokering information and services in a sensible, intelligent manner.
The future situation will most likely be a cross-over of these two opposites, as electronic and human brokers have quite different (often complementary) qualities and abilities. By combining the strengths of both types, interesting forms of co-operation can be established. In the short term, this will most likely lead to a situation where the human broker takes care of the intelligence in the process, whereas the software counterparts (agents) do the laborious work, such as gathering and updating (mostly meta-)information about sources and seekers;

"Electronic brokers will be required to permit even reasonably efficient levels and patterns of exchanges. Their ability to handle complex, albeit mechanical, transactions, to process millions of bits of information per second, and to act in a demonstrably even-handed fashion will be critical as [the electronic market place] develops."
from [RESN95]

From that point on, these electronic 'bots will evolve from cunning assistants into helpful partners in the brokering process (although it will take years before we get there). This subject will be dealt with in more detail in the next section.

The most obvious users of brokering services are humans and human parties/organisations. However, brokers could also offer valuable services to software entities/parties, such as software agents. It seems logical to relieve human participants in the information market of the burden of finding out, all by themselves, who is offering and/or seeking certain information and services; why, then, would we still want to burden software (such as software agents) with exactly that task? If brokers can be of help to humans, they can just as well be of help to software programs with comparable needs and problems. When brokers are able to provide up-to-date information about sources and services, agents no longer have to keep track of which agents are (still) online and/or available. This enables them to issue a request without specifying a specific receiving party: (just about) any party that can satisfy the request will do.

"[The Internet] is dynamic, and agents [will] presumably [be] coming online and going offline all the time. By inserting facilitators into the picture, the burden is lifted from the individual agents. Facilitators can keep track of what agents know or, as a second level of abstraction, may maintain locations of other facilitators, categorized by ontology or discipline. [In the future] commercial agent facilitators may act as brokers, striking cost-per-transaction deals with agents or other facilitators to satisfy requests or debit and credit accounts accordingly."
from [PLAI97]
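
As a minimal sketch of what such a broker or facilitator boils down to, consider a simple capability registry that matches a request to whichever registered provider can satisfy it, so the requesting agent never has to name a specific receiver. All class, method and capability names below are hypothetical, not taken from any existing agent platform:

    # Illustrative sketch only: a broker/facilitator reduced to a capability
    # registry; a requesting agent never names a specific receiving party.
    class Broker:
        def __init__(self):
            self.providers = {}  # capability -> list of registered provider agents

        def register(self, agent, capability):
            """A provider advertises that it can handle a given capability."""
            self.providers.setdefault(capability, []).append(agent)

        def unregister(self, agent, capability):
            """Providers that go offline are removed from the registry."""
            if agent in self.providers.get(capability, []):
                self.providers[capability].remove(agent)

        def request(self, capability, payload):
            """Forward a request to any registered provider that can satisfy it."""
            for agent in self.providers.get(capability, []):
                result = agent.handle_request(payload)
                if result is not None:
                    return result
            return None  # no (available) provider could satisfy the request

A seeker would then simply call broker.request("stock-quotes", query), for example, and leave it to the broker to decide which provider actually answers.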

To make this kind of system work, an expressive (knowledge representation) language is needed in which information and requests can be expressed sufficiently well. To keep things as open and as simple as possible, it is strongly preferable that this language becomes the standard for such tasks, in the same way as HTML is the standard for Web content.1
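
The footnote below mentions KQML as a candidate for such a language. As a hedged illustration (the performative and parameter names follow published KQML examples, but the agent names and content are made up), such messages could be composed as simple s-expressions, for instance from Python:

    def kqml(performative, **params):
        """Render a KQML-style message as an s-expression string (illustrative)."""
        fields = " ".join(f":{key.replace('_', '-')} {value}"
                          for key, value in params.items())
        return f"({performative} {fields})"

    # A source advertises what it can answer; a seeker later asks the broker,
    # which may forward the question to any source that advertised it.
    advertisement = kqml("advertise",
                         sender="stock-source-1", receiver="broker-1",
                         ontology="stock-quotes",
                         content="(ask-one :content (price IBM ?p))")

    question = kqml("ask-one",
                    sender="seeker-7", receiver="broker-1",
                    ontology="stock-quotes", reply_with="q1",
                    content="(price IBM ?p)")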
Speaking of standards: to fit in well with the framework as described so far, agents will also need to adhere to certain standards and protocols. It is infeasible to make parties in the information market account for every possible type of agent they might have to deal with. Therefore, agents should respond and react similarly (regardless of their internal code and structure) to certain requests or questions. However, the standards or protocols chosen for this will have to be flexible enough to accommodate issues and developments that are unforeseen at present.
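
To make the "respond similarly regardless of internal code" requirement a little more concrete: one could imagine every agent exposing the same minimal message-handling interface, with a uniform, well-defined answer for message types that did not yet exist when the agent was written. The sketch below assumes such a convention; none of the names come from an existing standard:

    class StandardAgent:
        """Minimal uniform interface every agent could be required to expose."""

        def handle(self, performative, content):
            handler = getattr(self, "on_" + performative.replace("-", "_"), None)
            if handler is None:
                # Flexibility clause: an unforeseen performative gets a uniform,
                # well-defined reply instead of undefined behaviour.
                return ("sorry", f"performative '{performative}' not understood")
            return handler(content)

        def on_ask_one(self, content):
            raise NotImplementedError  # each agent supplies its own logic here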

Besides standards, another concern is the question of how parties will find out which broker is best to use (in general, or in a particular situation). We expect that in the short term this will be a matter of trial and error, and of word of mouth. Just as with search engines, brokers and intermediaries will ultimately be judged (and appreciated) by the results they deliver, and possibly by the price they ask for doing so. In the longer term, tools might become available that help determine which broker to use at a certain moment, for a certain task, at certain costs, etcetera.
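
Such a tool would, in essence, have to weigh the quality of the results a broker has delivered so far against the price it asks. A very rough sketch of that trade-off (the attributes and the weighting are assumptions for illustration, not taken from the text):

    def choose_broker(brokers, weight_quality=0.7, weight_price=0.3):
        """Pick the broker with the best (hypothetical) quality/price trade-off.

        Each broker is a dict with illustrative fields: 'name',
        'avg_result_quality' (0..1, e.g. built up from earlier experiences and
        word of mouth) and 'price_per_request'.
        """
        max_price = max(b["price_per_request"] for b in brokers) or 1.0

        def score(b):
            return (weight_quality * b["avg_result_quality"]
                    - weight_price * b["price_per_request"] / max_price)

        return max(brokers, key=score)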


1. A candidate language, specifically developed for this task, could be KQML. However, this language has yet to make it into any major commercial product. Implementations have primarily been restricted to technology demonstrations in proprietary configurations. See [PLAI97] for more detailed information about KQML.