ONTOLOGIES

What Are Ontologies, and Why Do We Need Them?

B. Chandrasekaran and John R. Josephson, Ohio State University
V. Richard Benjamins, University of Amsterdam

Theories in AI fall into two broad categories: mechanism theories and content theories. Ontologies are content theories about the sorts of objects, properties of objects, and relations between objects that are possible in a specified domain of knowledge. They provide potential terms for describing our knowledge about the domain.

In this article, we survey the recent development of the field of ontologies in AI. We point to the somewhat different roles ontologies play in information systems, natural-language understanding, and knowledge-based systems. Most research on ontologies focuses on what one might characterize as domain factual knowledge, because knowledge of that type is particularly useful in natural-language understanding. There is another class of ontologies that is important in KBS: one that helps in sharing knowledge about reasoning strategies or problem-solving methods. In a follow-up article, we will focus on method ontologies.

Ontology as vocabulary

In philosophy, ontology is the study of the kinds of things that exist. It is often said that ontologies "carve the world at its joints." In AI, the term ontology has largely come to mean one of two related things. First of all, ontology is a representation vocabulary, often specialized to some domain or subject matter. More precisely, it is not the vocabulary as such that qualifies as an ontology, but the conceptualizations that the terms in the vocabulary are intended to capture. Thus, translating the terms in an ontology from one language to another, for example from English to French, does not change the ontology conceptually.
In engineering design, you might discuss the ontology of an electronic-devices domain, which might include vocabulary that describes conceptual elements (transistors, operational amplifiers, and voltages) and the relations between these elements (operational amplifiers are a type-of electronic device, and transistors are a component-of operational amplifiers). Identifying such vocabulary, and the underlying conceptualizations, generally requires careful analysis of the kinds of objects and relations that can exist in the domain.

In its second sense, the term ontology is sometimes used to refer to a body of knowledge describing some domain, typically a commonsense knowledge domain, using a representation vocabulary. For example, CYC[1] often refers to its knowledge representation of some area of knowledge as its ontology.

In other words, the representation vocabulary provides a set of terms with which to describe the facts in some domain, while the body of knowledge using that vocabulary is a collection of facts about a domain. However, this distinction is not as clear as it might first appear. In the electronic-device example, that transistor is a component-of operational amplifier, or that the latter is a type-of electronic device, is just as much a fact about its domain as a CYC fact about some aspect of space, time, or numbers. The distinction is that the former emphasizes the use of ontology as a set of terms for representing specific facts in an instance of the domain, while the latter emphasizes the view of ontology as a general set of facts to be shared.

[This survey provides a conceptual introduction to ontologies and their role in information systems and AI. The authors also discuss how ontologies clarify the domain's structure of knowledge and enable knowledge sharing.]

[1094-7167/99/$10.00 1999 IEEE, IEEE INTELLIGENT SYSTEMS]

There continue to be inconsistencies in the usage of the term ontology.
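The electronic-devices vocabulary just described can be made concrete. The following Python fragment is a minimal sketch, not anything specified by the article: all names and the dictionary-based representation are invented for illustration.

```python
# Illustrative sketch of an ontology vocabulary: concepts linked by
# type-of (class hierarchy) and component-of (part) relations.
# All names here are hypothetical, not a standard ontology format.

type_of = {
    "operational-amplifier": "electronic-device",
    "transistor": "electronic-device",
}

component_of = {
    "transistor": "operational-amplifier",
}

def is_a(concept, ancestor):
    """Follow type-of links upward to test class membership."""
    while concept is not None:
        if concept == ancestor:
            return True
        concept = type_of.get(concept)
    return False

print(is_a("operational-amplifier", "electronic-device"))  # True
print(component_of["transistor"])  # operational-amplifier
```

The point of the sketch is that the vocabulary is more than a word list: the type-of links let a program induce the class hierarchy that the terms are intended to capture.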
At times, theorists use the singular term to refer to a specific set of terms meant to describe the entity and relation types in some domain. Thus, we might speak of "an ontology for liquids" or "for parts and wholes." Here, the singular term stands for the entire set of concepts and terms needed to speak about phenomena involving liquids and parts and wholes. When different theorists make different proposals for an ontology, or when we speak about ontology proposals for different domains of knowledge, we would then use the plural term ontologies to refer to them collectively. In AI and information-systems literature, however, there seems to be inconsistency: sometimes we see references to "ontology of domain" and other times to "ontologies of domain," both referring to the set of conceptualizations for the domain. The former is more consistent with the original (and current) usage in philosophy.

Ontology as content theory

The current interest in ontologies is the latest version of AI's alternation of focus between content theories and mechanism theories. Sometimes, the AI community gets excited by some mechanism such as rule systems, frame languages, neural nets, fuzzy logic, constraint propagation, or unification. The mechanisms are proposed as the secret of making intelligent machines. At other times, we realize that, however wonderful the mechanism, it cannot do much without a good content theory of the domain on which it is to work. Moreover, we often recognize that once a good content theory is available, many different mechanisms might be used equally well to implement effective systems, all using essentially the same content.[2]

AI researchers have made several attempts to characterize the essence of what it means to have a content theory.
McCarthy and Hayes's theory (the epistemic versus heuristic distinction),[3] Marr's three-level theory (the information-processing strategy level, the algorithms and data structures level, and the physical mechanisms level),[4] and Newell's theory (Knowledge Level versus Symbol Level)[5] all grapple in their own ways with characterizing content. Ontologies are quintessentially content theories, because their main contribution is to identify specific classes of objects and relations that exist in some domain. Of course, content theories need a representation language. Thus far, predicate-calculus-like formalisms, augmented with type-of relations (which can be used to induce class hierarchies), have been most often used to describe the ontologies themselves.

Why are ontologies important?

Ontological analysis clarifies the structure of knowledge. Given a domain, its ontology forms the heart of any system of knowledge representation for that domain. Without ontologies, or the conceptualizations that underlie knowledge, there cannot be a vocabulary for representing knowledge. Thus, the first step in devising an effective knowledge-representation system, and vocabulary, is to perform an effective ontological analysis of the field, or domain. Weak analyses lead to incoherent knowledge bases.

An example of why performing good analysis is necessary comes from the field of databases.[6] Consider a domain having several classes of people (for example, students, professors, employees, females, and males). This study first examined the way this database would be commonly organized: students, employees, professors, males, and females would be represented as types-of the class humans. However, some of the problems with this ontology are that students can also be employees at times and can also stop being students.
Further analysis showed that the terms student and employee do not describe categories of humans, but rather roles that humans can play, while terms such as female and male more appropriately represent subcategories of humans. Clarifying the terminology in this way lets the ontology support coherent and cohesive reasoning.

Second, ontologies enable knowledge sharing. Suppose we perform an analysis and arrive at a satisfactory set of conceptualizations, and their representative terms, for some area of knowledge, for example, the electronic-devices domain. The resulting ontology would likely include domain-specific terms such as transistors and diodes; general terms such as functions, causal processes, and modes; and terms that describe behavior, such as voltage. The ontology captures the intrinsic conceptual structure of the domain. In order to build a knowledge-representation language based on the analysis, we need to associate terms with the concepts and relations in the ontology and devise a syntax for encoding knowledge in terms of those concepts and relations. We can share this knowledge-representation language with others who have similar needs for knowledge representation in that domain, thereby eliminating the need to replicate the knowledge-analysis process. Shared ontologies can thus form the basis for domain-specific knowledge-representation languages. In contrast to the previous generation of knowledge-representation languages (such as KL-One), these languages are content-rich; they have a large number of terms that embody a complex content theory of the domain.

Shared ontologies let us build specific knowledge bases that describe specific situations. For example, different electronic-devices manufacturers can use a common vocabulary and syntax to build catalogs that describe their products. Then the manufacturers could share the catalogs and use them in automated design systems.
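The shared-catalog idea can be sketched as follows. This is a hedged illustration only: the vocabulary, part IDs, and attributes are invented, and the article prescribes no particular encoding.

```python
# Hypothetical sketch: two manufacturers describe products using one
# shared vocabulary, so a design tool can read both catalogs directly.

SHARED_VOCABULARY = {"transistor", "diode", "operational-amplifier"}

def make_entry(part_id, concept, **attrs):
    """Build a catalog entry, insisting on shared ontology terms."""
    assert concept in SHARED_VOCABULARY, f"unknown concept: {concept}"
    return {"id": part_id, "type-of": concept, **attrs}

catalog_a = [make_entry("A-100", "transistor", max_voltage=40)]
catalog_b = [make_entry("B-7", "diode", max_voltage=75)]

# Because both catalogs commit to the same vocabulary, one query
# works across vendors without any translation step.
merged = catalog_a + catalog_b
transistors = [e for e in merged if e["type-of"] == "transistor"]
print([e["id"] for e in transistors])  # ['A-100']
```

The design point is that the shared commitment lives in the vocabulary, not in either manufacturer's data: any tool that knows the ontology can consume any conforming catalog.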
This kind of sharing vastly increases the potential for knowledge reuse.

Describing the world

We can use the terms provided by the domain ontology to assert specific propositions about a domain or a situation in a domain. For example, in the electronic-device domain, we can represent a fact about a specific circuit: circuit 35 has transistor 22 as a component, where circuit 35 is an instance of the concept circuit and transistor 22 is an instance of the concept transistor.

Once we have the basis for representing propositions, we can also represent knowledge involving propositional attitudes (such as hypothesize, believe, expect, hope, desire, and fear). Propositional-attitude terms take propositions as arguments. Continuing with the electronic-device domain, we can assert, for example: the diagnostician hypothesizes or believes that part 2 is broken, or the designer expects or desires that the power plant has an output of 20 megawatts. Thus, an ontology can represent beliefs, goals, hypotheses, and predictions about a domain, in addition to simple facts. The ontology also plays a role in describing such things as plans and activities, because these also require specification of world objects and relations. Propositional-attitude terms are also part of a larger ontology of the world, useful especially in describing the activities and properties of the special class of objects in the world called intensional entities, for example, agents such as humans who have mental states.

[JANUARY/FEBRUARY 1999]

Constructing ontologies is an ongoing research enterprise. Ontologies range in abstraction, from very general terms that form the foundation for knowledge representation in all domains, to terms that are restricted to specific knowledge domains.
For example, space, time, parts, and subparts are terms that apply to all domains; malfunction applies to engineering or biological domains; and hepatitis applies only to medicine. Even in cases where a task might seem to be quite domain-specific, knowledge representation might call for an ontology that describes knowledge at higher levels of generality. For example, solving problems in the domain of turbines might require knowledge expressed using domain-general terms such as flows and causality. Such general-level descriptive terms are called the upper ontology or top-level ontology. There are many open research issues about the correct ways to analyze knowledge at the upper level. To provide some idea of the issues involved, Figure 1 excerpts a quote from a recent call for papers.

Today, ontology has grown beyond philosophy and now has many connections to information technology. Thus, research on ontology in AI and information systems has had to produce pragmatically useful proposals for top-level ontology. The organization of a top-level ontology poses a number of problems, similar to the problems that surround ontology in philosophy. For example, many ontologies have thing or entity as their root class. However, Figure 2 illustrates that thing and entity start to diverge at the next level. For example, CYC's thing has the subcategories individual object, intangible, and represented thing; the Generalized Upper Model's[7] (GUM) um-thing has the subcategories configuration, element, and sequence; Wordnet's[8] thing has the subcategories living thing and nonliving thing; and Sowa's root T has the subcategories concrete, process, object, and abstract. (Natalya Fridman Noy's and Carol Hafner's article discusses these differences more fully.[9]) Some of these differences arise because not all of these ontologies are intended to be general-purpose tools, or even explicitly to be ontologies.
Another reason for the differences is that, in principle, there are many different taxonomies. Although differences exist between ontologies, they also agree on many issues:

• There are objects in the world.
• Objects have properties or attributes that can take values.
• Objects can exist in various relations with each other.
• Properties and relations can change over time.
• There are events that occur at different time instants.
• There are processes in which objects participate and that occur over time.
• The world and its objects can be in different states.
• Events can cause other events or states as effects.
• Objects can have parts.

The representational repertoire of objects, relations, states, events, and processes does not say anything about which classes of these entities exist. The modeler of the domain makes these commitments. As we move from an ontology's top to lower taxonomic levels, commitments specific to domains and phenomena appear. For modeling objects on earth, we can make certain commitments. For example, animals, minerals, and plants are subcategories of objects; has-life(x) and contains-carbon(x) are object properties; and can-eat(x, y) is a possible relation between any two objects. These commitments are specific to objects and phenomena in this domain. Further, the commitments are not arbitrary. For them to be useful, they should reflect some underlying reality.

There is no sharp division between domain-independent and domain-specific ontologies for representing knowledge. For example, the terms object, physical object, device, engine, and diesel engine all describe objects, but in an order of increasing domain specificity. Similarly, terms for relations between objects can span a range of specificity, such as connected, electrically-connected, and soldered-to.

Subtypes of concepts.
Ontologies generally appear as a taxonomic tree of conceptualizations, from very general and domain-independent at the top levels to increasingly domain-specific further down in the hierarchy. We mentioned earlier that different ontologies propose different subtypes of even very general concepts. This is because, as a rule, different sets of subcategories will result from different criteria for categorization. Two, among many, alternate subcategorizations of the general concept object are physical and abstract, and living and nonliving. In some cultures and languages, words for objects have gender, thus creating another top-level classification along the gender axis. We can easily think of additional subcategorizations based on other criteria. The problem of alternate categorizations only becomes more acute as we begin to model specific domains of knowledge. For example, we can subcategorize causal process into continuous and discrete causal processes along the dimension of how time is represented, and into mechanical, chemical, biological, cognitive, and social processes along the dimension of the kinds of objects and relations involved in the description.

In principle, the number of classification criteria and distinct subtypes is unlimited, because the number of possible dimensions along which to develop subcategories cannot be exhaustively specified. Often, this fact is not obvious in general-purpose ontologies, because the top levels of such ontologies commit to the most commonly useful subtypes. However, domain-specific ontologies can contain categorizations along dimensions that are usually outside the general ontology.

Figure 1. Call for papers for a special issue on temporal parts for The Monist, An International Quarterly Journal of General Philosophical Inquiry. This quote suggests that ontology has always been an issue of deep concern in philosophy and that the issues continue to occupy contemporary philosophers:

"On the one hand there are entities, such as processes and events, which have temporal parts. On the other hand there are entities, such as material objects, which are always present in their entirety at any time at which they exist at all. The categorical distinction between entities which do, and entities which do not have temporal parts is grounded in common sense. Yet various philosophers have been inclined to oppose it. Some have defended an ontology consisting exclusively of things with no temporal parts. Whiteheadians have favored ontologies including only temporally extended processes. Quine has endorsed a four-dimensional ontology in which the distinction between objects and processes vanishes and every entity comprises simply the content of some arbitrarily demarcated portion of space-time. One further option, embraced by philosophers such as David Lewis, accepts the opposition between objects and processes, while still finding a way to allow that all entities have both spatial and temporal parts."

Task dependence of ontologies.

How task-dependent are ontologies? Presumably, the kinds of things that actually exist do not depend on our goals. In that sense, ontologies are not task-dependent. On the other hand, what aspects of reality are chosen for encoding in an ontology does depend on the task. For example, in the domain of fruits, we would focus on particular aspects of reality if we were developing the ontology for the selection of pesticides; we would focus on other aspects of reality if we were developing an ontology to help chefs select fruits for cooking. In ontologies for engineering applications, categorizing causal processes into those that do, and those that do not, produce dangerous side effects might be useful.
Design engineers and safety analysts might find this a very useful categorization, though it is unlikely to be part of a general-purpose ontology's view of the causal-process concept.

Practically speaking, an ontology is unlikely to cover all possible potential uses. In that sense, both an ontology for a domain and a knowledge base written using that ontology are likely to be more appropriate for certain uses than others, and unlikely to be sharable across widely divergent tasks. This is, by now, a truism in KBS research and is the basic insight that led to the current focus on the relationship between tasks and knowledge types. Presuppositions or requirements can be associated with problem-solving methods for different tasks so that they capture explicitly the way in which ontologies are task-dependent. For example, a method might have a presupposition (or assumption[10]) stating that it works correctly only if the ontology allows modeling causal processes discretely. Therefore, assumptions are a key factor in practical sharing of ontologies.

Technology for ontology sharing

There have been several recent attempts to create engineering frameworks for constructing ontologies. Michael R. Genesereth and Richard E. Fikes describe KIF (Knowledge Interchange Format), an enabling technology that facilitates expressing domain factual knowledge using a formalism based on augmented predicate calculus.[11] Robert Neches and his colleagues describe a knowledge-sharing initiative,[12] while Thomas R. Gruber has proposed a language called Ontolingua to help construct portable ontologies.[13] In Europe, the CommonKADS project has taken a similar approach to modeling domain knowledge.[14]

These languages use varieties of predicate calculus as the basic formalism. Predicate calculus facilitates the representation of objects, properties, and relations. Variations such as situational calculus introduce time so as to represent states, events, and processes.
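To make the predicate-calculus style concrete, here is a small sketch of domain facts written as predicate-argument tuples, with a situation-calculus-flavored time argument. This is in the spirit, not the syntax, of languages like KIF, and every identifier in it is invented for illustration.

```python
# Hypothetical sketch: ground facts as (predicate, arg, ...) tuples.
# A "holds" wrapper with a state argument mimics the way situational
# calculus time-indexes relations to describe states and events.

facts = {
    ("type-of", "transistor22", "transistor"),
    ("component-of", "transistor22", "circuit35"),
    # time-indexed fact: the voltage relation holds in state s0
    ("holds", ("voltage", "node4", 5.0), "s0"),
}

def query(pred, *args):
    """Return facts matching a predicate and argument pattern;
    None acts as a wildcard."""
    out = []
    for f in facts:
        if f[0] != pred or len(f) - 1 != len(args):
            continue
        if all(a is None or a == b for a, b in zip(args, f[1:])):
            out.append(f)
    return out

print(query("component-of", None, "circuit35"))
```

Note that nothing here commits to an inference mechanism; the same fact set could back a theorem prover, a frame system, or a plain lookup table.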
If we extend the idea of knowledge to include images and other sense modalities, we might need radically different kinds of representation. For now, predicate calculus provides a good starting point for ontology-sharing technologies.

Using a logical notation for writing and sharing ontologies does not imply any commitment to implementing a related knowledge system or a related logic. We are simply taking a knowledge-level[5] stance in describing the knowledge system, whatever the means of implementation. In this view, we can ask of any intelligent system, even one implemented as a neural network, "What does the system know?"

Use of ontologies

In AI, knowledge in computer systems is thought of as something that is explicitly represented and operated on by inference processes. However, that is an overly narrow view. All information systems traffic in knowledge. Any software that does anything useful cannot be written without a commitment to a model of the relevant world: to entities, properties, and relations in that world. Data structures and procedures implicitly or explicitly make commitments to a domain ontology. It is common to ask whether a payroll system knows about the new tax law, or whether a database system knows about employee salaries. Information-retrieval systems, digital libraries, integration of heterogeneous information sources, and Internet search engines need domain ontologies to organize information and direct the search processes. For example, a search engine has categories and subcategories that help organize the search. The search-engine community commonly refers to these categories and subcategories as ontologies.

Object-oriented design of software systems similarly depends on an appropriate domain ontology. Objects, their attributes, and their procedures more or less mirror aspects of the domain that are relevant to the application.
Object systems representing a useful analysis of a domain can often be reused for a different application program. Object systems and ontologies emphasize different aspects, but we anticipate that over time convergence between these technologies will increase. As information systems model large knowledge domains, domain ontologies will become as important in general software systems as in many areas of AI.

In AI, while knowledge representation pervades the entire field, two application areas in particular have depended on a rich body of knowledge. One of them is natural-language understanding. Ontologies are useful in NLU in two ways. First, domain knowledge often plays a crucial role in disambiguation. A well-designed domain ontology provides the basis for domain knowledge representation. In addition, the ontology of a domain helps identify the semantic categories that are involved in understanding discourse in that domain. For this use, the ontology plays the role of a concept dictionary. In general, for NLU, we need both a general-purpose upper ontology and a domain-specific ontology that focuses on the domain of discourse (such as military communications or business stories). CYC, Wordnet,[8] and Sensus[15] are examples of sharable ontologies that have been used for language understanding.

[Figure 2. Illustration of how ontologies differ in their analyses of the most general concepts.]

Knowledge-based problem solving is the second area in AI that is a big consumer of knowledge. KBPS systems solve a variety of problems, such as diagnosis, planning, and design, by using a rich body of knowledge. Currently, KBPS systems employ domain-specific knowledge, which is often sufficient for constructing knowledge systems that target specific application areas and tasks.
However, even in specific application areas, knowledge systems can fail catastrophically when they are pushed to the edge of the capability of the domain-specific knowledge. In response to this particular shortcoming, researchers have proposed that problem-solving systems need commonsense knowledge in addition to domain-specific knowledge. The initial motivation for CYC was to provide such a body of sharable commonsense knowledge for knowledge-based systems. There is a similar need for developing domain-specific knowledge. Thus, ontology-based knowledge-base development provides a double advantage. The ontologies themselves are sharable. With these ontologies, we can build knowledge bases using the structure of conceptualizations to encode specific pieces of knowledge. The knowledge bases that we develop using these ontologies can be shared more reliably, because the formal ontology that underlies them can help clarify the representation's semantics.

Information systems and NLU systems need factual knowledge about their domains of discourse. The inferences they make are usually simple. Problem-solving systems, in contrast, engage in complex sequences of inferences to achieve their goals. Such systems need to have reasoning strategies that enable them to choose among alternative reasoning paths. Ontology specification in knowledge systems has two dimensions:

• Domain factual knowledge provides knowledge about the objective realities in the domain of interest (objects, relations, events, states, causal relations, and so forth).
• Problem-solving knowledge provides knowledge about how to achieve various goals. A piece of this knowledge might be in the form of a problem-solving method specifying, in a domain-independent manner, how to accomplish a class of goals.

Most early research in KBPS mixed factual and problem-solving knowledge into highly domain-specific rules, called domain knowledge.
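The contrast between mixing and separating the two dimensions can be sketched as follows. The diagnosis rule and causal model below are entirely invented; they stand in for the kinds of rules early KBPS systems used, not for any system described in the article.

```python
# Mixed style: the strategy ("propose these checks, in this order")
# and the facts ("fan and pump failures cause overheating") are
# entangled in one hypothetical domain-specific rule.
def mixed_diagnose(symptom):
    if symptom == "overheating":
        return ["check fan", "check pump"]
    return []

# Separated style: domain factual knowledge lives in a causal model...
causes = {"overheating": ["fan-failure", "pump-failure"]}

# ...while a domain-independent method (here, a trivially simple
# abductive strategy: propose every recorded cause of a symptom)
# can be reused with any causal model.
def propose_causes(symptom, causal_model):
    return causal_model.get(symptom, [])

print(propose_causes("overheating", causes))  # ['fan-failure', 'pump-failure']
```

In the separated form, the strategy can be abstracted and reused against a different domain's causal model, which is exactly the observation the next paragraph makes.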
As research progressed, it became clear that there were systematic commonalities in reasoning strategies between goals of similar types. These reasoning strategies were also characterized by their need for specific types of domain factual knowledge. It soon became clear that strategic knowledge could be abstracted and reused.

With few exceptions,[16,17] the domain factual knowledge dimension drives the focus of most of the AI investigations on ontologies.
This is because applications to language understanding motivate much of the work on ontologies. Even CYC, which was originally motivated by the need for knowledge systems to have world knowledge, has been tested more in natural-language than in knowledge-systems applications.

KBPS researchers realized that, in addition to factual knowledge, there is knowledge about how to achieve problem-solving goals. In fact, this emphasis on methods appropriate for different types of problems fueled second-generation research in knowledge systems.[18] Most of the KBPS community's work on knowledge representation is not well known to the general knowledge-representation community. In the coming years, we expect an increased focus on method ontologies as a sharable knowledge resource.

Acknowledgments

This article is based on work supported by the Office of Naval Research under Grant N00014-96-1-0701. We gratefully acknowledge the support of ONR and the DARPA RaDEO program. Any opinions, findings, and conclusions or recommendations expressed in this publication are those of the authors and do not necessarily reflect the views of ONR. The Netherlands Computer Science Research Foundation supported Richard Benjamins with financial support from the Netherlands Organization for Scientific Research (NWO).

References

1. D.B. Lenat and R.V. Guha, Building Large Knowledge-Based Systems: Representation and Inference in the CYC Project, Addison-Wesley, Reading, Mass., 1990.
2. B. Chandrasekaran, "AI, Knowledge, and the Quest for Smart Systems," IEEE Expert, Vol. 9, No. 6, Dec. 1994, pp. 2-6.
3. J. McCarthy and P.J. Hayes, "Some Philosophical Problems from the Standpoint of Artificial Intelligence," Machine Intelligence, Vol. 4, B. Meltzer and D. Michie, eds., Edinburgh University Press, Edinburgh, 1969, pp. 463-502.
4. D. Marr, Vision: A Computational Investigation into the Human Representation and Processing of Visual Information, W.H. Freeman, San Francisco, 1982.
5. A. Newell, "The Knowledge Level," Artificial Intelligence, Vol. 18, 1982, pp. 87-127.
6. R. Wieringa and W. de Jonge, "Object Identifiers, Keys and Surrogates: Object Identifiers Revisited," Theory and Practice of Object Systems (TAPOS), Vol. 1, No. 2, 1995, pp. 101-114.
7. J.A. Bateman, B. Magini, and F. Rinaldi, "The Generalized Upper Model," Working Papers 1994 European Conf. Artificial Intelligence (ECAI 94) Workshop on Implemented Ontologies, 1994, pp. 34-45; http://www.darmstadt.gmd.de/publish/komet/papers/ecai94.ps.
8. G.A. Miller, "Wordnet: An Online Lexical Database," Int'l J. Lexicography, Vol. 3, No. 4, 1990, pp. 235-312.
9. N. Fridman Noy and C.D. Hafner, "The State of the Art in Ontology Design," AI Magazine, Vol. 18, No. 3, 1997, pp. 53-74.
10. D. Fensel and V.R. Benjamins, "The Role of Assumptions in Knowledge Engineering," Int'l J. Intelligent Systems, Vol. 13, No. 7, 1998, pp. 715-747.
11. M.R. Genesereth and R.E. Fikes, Knowledge Interchange Format, Version 0.3, Knowledge Systems Lab., Stanford Univ., Stanford, Calif., 1992.
12. R. Neches et al., "Enabling Technology for Knowledge Sharing," AI Magazine, Vol. 12, No. 3, 1991, pp. 36-56.
13. T.R. Gruber, "A Translation Approach to Portable Ontology Specifications," Knowledge Acquisition, Vol. 5, 1993, pp. 199-220.
14. G. Schreiber et al., "CommonKADS: A Comprehensive Methodology for KBS Development," IEEE Expert, Vol. 9, No. 6, Dec. 1994, pp. 28-37.
15. K. Knight and S. Luk, "Building a Large-Scale Knowledge Base for Machine Translation," Proc. Am. Assoc. Artificial Intelligence, AAAI Press, Menlo Park, Calif., 1994.
16. D. Fensel et al., "Using Ontologies for Defining Tasks, Problem-Solving Methods and Their Mappings," Knowledge Acquisition, Modeling and Management, E. Plaza and V.R. Benjamins, eds., Springer-Verlag, New York, 1997, pp. 113-128.
17. R. Mizoguchi, J. Van Welkenhuysen, and M. Ikeda, "Task Ontology for Reuse of Problem Solving Knowledge," Towards Very Large Knowledge Bases, N.J.I. Mars, ed., IOS Press, Amsterdam, 1995.
18. J.M. David, J.P. Krivine, and R. Simmons, Second Generation Expert Systems, Springer-Verlag, 1993.

B. Chandrasekaran is professor emeritus, a senior research scientist, and the director of the Laboratory for AI Research (LAIR) in the Department of Computer and Information Science at Ohio State University.
His research focuses on knowledge-based systems, causal understanding, diagrammatic reasoning, and cognitive architectures. He received his BE from Madras University and his PhD from the University of Pennsylvania, both in electrical engineering. He was editor-in-chief of IEEE Expert from 1990 to 1994, and he serves on the editorial boards of numerous international journals. He is a fellow of the IEEE, AAAI, and ACM. Contact him at the Laboratory for AI Research, Ohio State Univ., Columbus, OH 43210; chandra@cis.ohio-state.edu; http://www.cis.ohio-state.edu/lair/.

John R. Josephson is a research scientist and the associate director of the Laboratory for AI Research in the Department of Computer and Information Science at Ohio State University. His primary research interests are knowledge-based systems, abductive inference, causal reasoning, theory formation, speech recognition, perception, diagnosis, the logic of investigation, and the foundations of science. He received his BS and MS in mathematics and his PhD in philosophy, all from Ohio State University. He has worked in several application domains, including medical diagnosis, medical test interpretation, diagnosis of engineered systems, logistics planning, speech recognition, molecular biology, design of electromechanical systems, and interpretation of aerial photographs. He is the coeditor, with Susan Josephson, of Abductive Inference: Computation, Philosophy, Technology, Cambridge Univ. Press, 1994. Contact him at the Laboratory for AI Research, Ohio State Univ., Columbus, OH 43210; jj@cis.ohio-state.edu; http://www.cis.ohio-state.edu/lair/.

Richard Benjamins is a senior researcher and lecturer at the Department of Social Science Informatics at the University of Amsterdam. His research interests include knowledge engineering, problem-solving methods and ontologies, diagnosis and planning, and AI and the Web. He obtained his BS in cognitive psychology and his PhD in artificial intelligence from the University of Amsterdam.
Contact him at the Dept. of Social Science Informatics, Univ. of Amsterdam, Roetersstraat 15, 1018 WB Amsterdam, The Netherlands; richard@swi.psy.uva.nl; http://www.swi.psy.uva.nl/usr/richard/home.html.

