Processing real-world information is messy. It's even messier with Legal Language, where the amount of information and degree of precision encoded in language is extremely high. Every single word carries meaning and is often placed with care. While Legal Language can be difficult for us mere mortals to read and understand, the lawyers who craft contracts often do so with great intention. To properly understand Legal Language, we need methods that bridge the gap between the messy realm of ordinary discourse and the unambiguous definitions and reasoning structures that computers handle well. To represent knowledge in an exact and structured way, philosophers and computer scientists have long worked with “ontologies.” An ontology is a vocabulary plus a set of rules for constructing assertions.
There are ontologies for all kinds of specialties, including several for legal knowledge. The most popular in the legal realm is LKIF, the Legal Knowledge Interchange Format. It’s an “interchange format” because many different formalisms can be translated into LKIF and then back into a different formalism.
LKIF uses the Web Ontology Language, called OWL in a strained effort at an acronym. In OWL, you can define classes, name instances of classes, and declare that they have properties that relate them to other classes or instances.
OWL isn’t really a “language” in the ordinary sense, or even the usual computer sense. It’s a set of rules for making assertions, but it doesn’t fix a syntax: there are several concrete serializations (RDF/XML, Turtle, and Manchester syntax, among others) that look nothing alike, yet all express the same set of assertions. For example, with OWL you can say that the class “citizen” is a subclass of “natural person,” or that John Doe is a member of the class “citizen.” Entities can have properties relating them to other entities; John Doe might have the “citizen of” property, with the value “United States.” In the case of dual citizenship, he would have two “citizen of” properties. OWL also allows negative assertions, for instance, that an entity of the class “corporation” cannot be a member of the class “citizen.”
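The assertions above can be sketched as plain subject–predicate–object triples, which is the data model underneath every OWL serialization. This is a minimal illustration in Python, not real OWL syntax; names like “Citizen” and “citizenOf” are invented for the example, not actual LKIF or OWL identifiers.

```python
# OWL-style assertions as subject-predicate-object triples,
# modeled with a plain Python set. All identifiers are illustrative.
triples = {
    # The class "citizen" is a subclass of "natural person"
    ("Citizen", "subClassOf", "NaturalPerson"),
    # John Doe is a member (instance) of the class "citizen"
    ("JohnDoe", "type", "Citizen"),
    # Dual citizenship is simply two "citizen of" properties
    ("JohnDoe", "citizenOf", "United States"),
    ("JohnDoe", "citizenOf", "Canada"),
    # A negative assertion: no corporation can be a citizen
    ("Corporation", "disjointWith", "Citizen"),
}

def values(subject, predicate):
    """All objects asserted for a given subject and predicate."""
    return {o for (s, p, o) in triples if s == subject and p == predicate}

print(sorted(values("JohnDoe", "citizenOf")))
# ['Canada', 'United States']
```

A real system would store these triples in an RDF store rather than a Python set, but the shape of the data is the same.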
LKIF incorporates both general terms about the real world, such as persons, actions, and beliefs, and terms specific to law, such as mandates, decisions, and legal speech acts. It can be useful in a number of ways.
Software can turn information expressed in LKIF into a database, storing information about laws, contracts, organizations, and so on in a way that’s easy to query. For instance, it could enumerate the parties to a contract and the obligations of each one.
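To give a feel for that kind of query, here is a toy sketch. The contract structure (parties, obligations) is invented for illustration; in practice this would be a SPARQL query against a triple store holding the LKIF data.

```python
# A toy "contract database" and a query that enumerates one party's
# obligations. The structure and field names are illustrative only.
contract = {
    "parties": ["Acme Corp", "Jane Roe"],
    "obligations": [
        {"party": "Acme Corp", "duty": "deliver 100 widgets by June 1"},
        {"party": "Jane Roe",  "duty": "pay $5,000 on delivery"},
    ],
}

def obligations_of(contract, party):
    """Enumerate the duties owed by one party under the contract."""
    return [o["duty"] for o in contract["obligations"] if o["party"] == party]

print(obligations_of(contract, "Acme Corp"))
# ['deliver 100 widgets by June 1']
```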
Lawyers and judges can use LKIF-encoded information to draw inferences and make arguments in a case. Semantic reasoning software can answer questions by following a chain of logic based on the information.
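The “chain of logic” idea can be shown with a toy forward-chaining reasoner: starting from known facts, keep applying if-then rules until nothing new can be derived. The facts and rules below are invented for illustration; production reasoners (such as OWL DL reasoners) are far more sophisticated.

```python
# A toy forward-chaining reasoner: derive everything that follows
# from a set of facts and if-then rules. Facts/rules are illustrative.
facts = {"citizen(JohnDoe)"}
rules = [
    # "every citizen is a natural person"
    ("citizen(JohnDoe)", "natural_person(JohnDoe)"),
    # "every natural person may sign contracts"
    ("natural_person(JohnDoe)", "may_sign_contracts(JohnDoe)"),
]

def forward_chain(facts, rules):
    """Apply rules repeatedly until no new facts can be derived."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for premise, conclusion in rules:
            if premise in derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return derived

print(sorted(forward_chain(facts, rules)))
# ['citizen(JohnDoe)', 'may_sign_contracts(JohnDoe)', 'natural_person(JohnDoe)']
```

Note that the conclusion “John Doe may sign contracts” took two steps: neither rule alone gets there, which is exactly the chain-following behavior described above.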
Don’t get too excited, though: OWL-based ontologies have their limits. They deal in black-and-white facts and are not good at expressing considerations that carry weight but aren’t definitive. You can’t say in LKIF what “the preponderance of the evidence” or “probable cause” means. This matters in the oft-litigated “best efforts” vs. “reasonable efforts” distinction that some lawyers are so fond of arguing about on the Internets. What LKIF can do is describe the facts and legal principles that lie behind many complex concepts. For instance, the description of a case in LKIF might say “Jane Roe presented eyewitness testimony,” and the legal database might say “sworn eyewitness testimony is admissible evidence,” but it can say what constitutes sufficient evidence only when there’s an explicit rule that leaves no room for discretion.
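The distinction can be made concrete with a sketch. An explicit, discretion-free rule (“sworn eyewitness testimony is admissible”) encodes cleanly; a weighing standard like “sufficiency” does not. The data shape here is invented for illustration.

```python
# An explicit, black-and-white rule encodes fine as a predicate.
# Field names ("kind", "sworn") are illustrative only.
def admissible(item):
    """Rule from the text: sworn eyewitness testimony is admissible."""
    return item["kind"] == "eyewitness_testimony" and item["sworn"]

evidence = [
    {"kind": "eyewitness_testimony", "sworn": True},
    {"kind": "eyewitness_testimony", "sworn": False},
]
print([admissible(e) for e in evidence])
# [True, False]

# By contrast, no such predicate can decide whether the evidence is
# *sufficient*: that requires weighing considerations, which this
# black-and-white style of encoding cannot express.
```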
An LKIF document can be a huge help in making sense of the cross-references that abound in legal documents. If a legislature has amended a law multiple times, its content might be spread over multiple books and can be hard to figure out even when all gathered in one document. Translated into the legal ontology, it’s laid out so that all the exceptions and revisions fit into a logical whole.
However, at Legal Robot, we found that LKIF and OWL could not fully express the conditional reasoning structures and probabilistic properties we needed to capture the complexity of contract language. This is especially apparent when working with the “strategic ambiguity” that makes some lawyers so deviously happy. So, we created our own ontology, taking parts from some of the logic models and formalized ontologies of philosophers long past. Ours is also simpler in many ways: we discarded constructs that are unnecessary for the fairly narrow domain of reasoning we currently cover, namely contracts.