|
|
|
|
|
* You may experience faulty or unexpected behaviour such as deadlocks or exceptions. Please report any bugs you encounter via our [feedback](http://isbi.aau.at/ontodebug/feedback) page; [we](http://isbi.aau.at/ontodebug/team) will try to fix them as soon as possible.
|
|
|
* There are still many features missing that we plan to implement ([see the list of open issues](https://git-ainf.aau.at/interactive-KB-debugging/debugger/issues)). If you have ideas for features that would be nice to have, [we](http://isbi.aau.at/ontodebug/team) would be pleased if you [send us your request](http://isbi.aau.at/ontodebug/feedback).
|
|
|
* Please note that this plugin is a BETA! **If you experience faulty behaviour of your Protégé instance** (for example, Protégé cannot start), it *may* be that this plugin causes the error. You can test this as follows: delete the file "*org.exquisite.protege-\<x.y.z>.BETA.jar*" in the "*plugins*" subdirectory of your Protégé installation directory and restart Protégé (*\<x.y.z>* represents the current version, such as 0.1.9). In the same way you can check whether other plugins are responsible for the fault. If this plugin is not causing the error, you can reinstall it later.
|
|
|
|
|
|
|
|
|
<br><br>
|
|
|
# Installation
|
|
|
##### Saved Test Cases
|
|
|
Next to the set of Acquired Test Cases you have the **Saved Test Cases**, showing either manually added test cases or previously saved acquired test cases.
|
|
|
|
|
|
|
|
|
While the Acquired Test Cases list shows the answers the user has given in the current debugging session, the Saved Test Cases list contains either manually added axioms (see the next section, Test Driven Development) or acquired axioms from previous debugging sessions that the user wants to reuse. To store Saved Test Cases permanently, they are saved as ontology annotations. Since this changes the ontology's annotations, the ontology is marked as modified and the user is asked to save the changes when closing Protégé.
|
|
|
|
|
|
Note that manually adding test cases is not possible while a debugging session is running.
|
|
|
|
|
|
|
|
|
*This screenshot shows a handcrafted non-entailed test case next to two saved entailed test cases from a previous debugging session.*
|
|
|
|
|
|
###### Test Driven Development
|
|
|
|
|
|
The Saved Test Cases view also enables the ontology engineer to follow a test-driven ontology development approach. The underlying principle is similar to test-driven software development: the user specifies test cases for the ontology before, during or after its development.
|
|
|
|
|
|
By manually creating an entailed test case, the user specifies a constraint (axiom) that MUST be satisfied in the intended ontology: it either has to be asserted or inferred by a reasoner.
|
|
|
|
|
|
By manually creating a non-entailed test case, the user specifies a statement (axiom) that MUST NOT be satisfied in the intended ontology. That is, if the given ontology is correct, this non-entailed test case must not be inferred by the reasoner.
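The semantics of the two test-case kinds can be illustrated with a small sketch. This is *not* the plugin's implementation: it models an ontology as a set of `SubClassOf` edges and reduces "reasoning" to reachability, so that `A SubClassOf B` is entailed iff `B` is reachable from `A`.

```python
# Illustrative sketch only: reasoning reduced to reachability over SubClassOf edges.
def entails(subclass_edges, sub, sup):
    """Return True iff `sub SubClassOf sup` follows from the asserted edges."""
    seen, stack = set(), [sub]
    while stack:
        cls = stack.pop()
        if cls == sup:
            return True
        if cls in seen:
            continue
        seen.add(cls)
        stack.extend(parent for (child, parent) in subclass_edges if child == cls)
    return False

ontology = {("Student", "Person"), ("Person", "Agent")}

# An entailed test case MUST hold in the intended ontology (asserted or inferred) ...
print(entails(ontology, "Student", "Agent"))      # True (inferred via Person)
# ... while a non-entailed test case MUST NOT hold.
print(entails(ontology, "Person", "Marsupials"))  # False
```

A real OWL reasoner decides entailment for much richer axiom types, but the pass/fail logic for both test-case kinds is exactly this check.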
|
|
|
|
|
|
In this way the ontology engineer can explicitly express his or her intentions for the ontology. Consistent ontologies can be tested by creating such test cases as well. Note that the **underlying ontology will not be modified** by adding test cases manually. Only in the case of a positive test case that is not yet entailed will the user get the option to add this test case to the ontology with an add button. If a new debugging session is started, these **manually crafted test cases are taken into account in the query generation** in order to obtain a repair that fulfills the intended ontology.
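Conceptually, taking the test cases into account means a candidate repair is only acceptable if the repaired ontology entails every entailed test case and none of the non-entailed ones. The following is a hedged sketch of that acceptance check, not the plugin's actual algorithm; as before, reasoning is reduced to reachability over `SubClassOf` edges for illustration.

```python
# Hedged sketch (not the plugin's actual algorithm) of test-case-aware repair filtering.
def entails(axioms, sub, sup):
    seen, stack = set(), [sub]
    while stack:
        cls = stack.pop()
        if cls == sup:
            return True
        if cls not in seen:
            seen.add(cls)
            stack.extend(p for (c, p) in axioms if c == cls)
    return False

def repair_acceptable(repaired, entailed_tests, non_entailed_tests):
    """A repair must entail all positive test cases and no negative ones."""
    return (all(entails(repaired, a, b) for (a, b) in entailed_tests)
            and not any(entails(repaired, a, b) for (a, b) in non_entailed_tests))

# Two candidate repairs (subsets of asserted axioms) of a faulty ontology:
repair_1 = {("Student", "Person")}
repair_2 = {("Student", "Person"), ("Person", "Marsupials")}

tests_pos = {("Student", "Person")}     # MUST be entailed
tests_neg = {("Person", "Marsupials")}  # MUST NOT be entailed

print(repair_acceptable(repair_1, tests_pos, tests_neg))  # True
print(repair_acceptable(repair_2, tests_pos, tests_neg))  # False
```

This is why saved test cases sharpen the debugger's queries: repairs that would contradict a test case can be discarded without asking the user.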
|
|
|
|
|
|
The manual creation of a test case can be initiated by pressing the add icon ![add](uploads/54618738693f148bbdbea6b523700b62/image.png).
|
|
|
|
|
|
As an example, let's create a positive test case that is entailed by the given ontology, ```Student SubClassOf Person```, and a negative test case that is not entailed by the given ontology, ```Person SubClassOf Marsupials```.
|
|
|
|
|
|
The user can now evaluate whether these test cases are fulfilled by pressing the evaluate icons ![image](uploads/6fe69e6b161d380fc4af2f99ee49e3e8/image.png).
|
|
|
The evaluation results in a green background highlighting, indicating that these test cases are fulfilled.
|
|
|
|
|
|
On the other hand, the user can add test cases that contradict the given ontology. So let us add ```Koala SubClassOf Person``` as a new entailed test case and ```Forest SubClassOf Habitat``` as a negative test case.
|
|
|
|
|
|
We now see the two remaining evaluation result types: the negative test case ```Forest SubClassOf Habitat``` is highlighted with a red background since this axiom is already asserted in the ontology. One would expect ```Koala SubClassOf Person``` to be red too, but instead we get a warning icon ![image](uploads/7f51cb68c7eeac6924eb736d64cbaa0c/image.png).
|
|
|
The reason is that the axiom contains at least one unsatisfiable class (Koala), and thus nothing can be said about this test case.
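The four evaluation outcomes can be summarized with a small sketch (assumed names, not the plugin's API): a test case is *fulfilled* (green) when its expectation matches what the reasoner derives, *violated* (red) when it does not, and a *warning* is shown instead of a verdict when the axiom mentions an unsatisfiable class, since a reasoner derives arbitrary subsumptions for such classes.

```python
# Illustrative sketch only (hypothetical names, not the plugin's API).
def evaluate(entailed_axioms, unsatisfiable_classes, axiom, expect_entailed):
    sub, sup = axiom
    if sub in unsatisfiable_classes or sup in unsatisfiable_classes:
        return "warning"  # unsatisfiable class involved: no meaningful verdict
    holds = axiom in entailed_axioms
    return "fulfilled" if holds == expect_entailed else "violated"

entailed = {("Student", "Person"), ("Forest", "Habitat")}
unsat = {"Koala"}

print(evaluate(entailed, unsat, ("Student", "Person"), True))   # fulfilled (green)
print(evaluate(entailed, unsat, ("Forest", "Habitat"), False))  # violated (red)
print(evaluate(entailed, unsat, ("Koala", "Person"), True))     # warning icon
```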
|
|
|
|
|
|
![image](uploads/4711d2bbb2c7a8dee6d0c471a64546a1/image.png)
|
|
|
|
|
|
*This screenshot shows the four handcrafted test cases with all possible evaluation outcomes.*
|
|
|
|
|
|
### Step 5: Starting a new Debugging Session
|
|
|
|
|
|
|
|
|
[1] Uli Sattler, Robert Stevens, Phillip Lord (2013). (I can't get no) satisfiability. Ontogenesis. http://ontogenesis.knowledgeblog.org/1329
|
|
|
|
|
|
[2] V. Sazonau. "Performance Prediction of OWL Reasoners". Master's thesis, The University of Manchester. http://www.cs.man.ac.uk/~sazonauv/SazonauThesis.pdf
|
|
|
|