ODF documents are made up of an archived (ZIP) package containing computer-readable XML elements and conform to the industry-recognised OASIS ODF specification.
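Because the package is a standard ZIP archive, its layout can be inspected directly. The following is a minimal sketch in Python that builds a hypothetical, simplified `.odt` in memory purely to illustrate the structure; a real document contains further parts (meta.xml, settings.xml, and so on):

```python
import io
import zipfile

# Build a minimal, hypothetical ODF package in memory to show its layout.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as odt:
    # 'mimetype' must be the first entry and stored uncompressed.
    odt.writestr(zipfile.ZipInfo("mimetype"),
                 "application/vnd.oasis.opendocument.text")
    odt.writestr("META-INF/manifest.xml", "<manifest/>")        # placeholder
    odt.writestr("content.xml", "<office:document-content/>")   # placeholder
    odt.writestr("styles.xml", "<office:document-styles/>")     # placeholder

with zipfile.ZipFile(io.BytesIO(buf.getvalue())) as odt:
    print(odt.namelist())
# → ['mimetype', 'META-INF/manifest.xml', 'content.xml', 'styles.xml']
```

It is the content.xml and styles.xml parts inside this package that round-trip testing is chiefly concerned with.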
However, not all ODF processing applications implement the specification fully; faults exist in some implementations, and in some cases the specification is not absolutely clear.
From the user's perspective, an ODF document may simply not look the same after being opened and saved in another office application, which causes confusion and a poor user experience.
Getting to the root cause of a user issue can be complex, as many applications provide create and edit functionality for ODF documents, with varying levels of implementation completeness and quality.
Sometimes issues are caused by missing fonts or styles, or by application defaults, which can change the look of a document even when the integrity of the core information is retained.
The Autotests framework was created by Jos Van Den Oever to enable a simple way to process a document across a variety of applications and compare the round-trip outputs.
This processing entails transmitting the source document to a 'factory' running an ODF application, where the document is opened and saved automatically before being returned to the Autotests server.
Documents are queued and processed in parallel through every available combination of registered factories, so, depending on the workload, results can be available in seconds across tens of application / operating system / language combinations.
The output of the Autotests framework is measurable and repeatable. Reports from the Autotests framework are therefore usually accepted by ODF software developers and vendors as legitimate issues, and they have been used to resolve problems and get fixes applied to a range of applications, both Open Source and proprietary.
As the Autotests platform is Open Source, you can download and run it yourself. If you need help with this, the ODF Plugfest community can assist you; see the support page for details.
To start off with, you need a problem to solve. You may have a theory based on knowledge of implementations, or you may be a developer or involved with standards work.
You could also be a user who has had experiences with ODF documents losing content or formatting after transmission.
Maybe you just want to experiment: try the advanced features of your application, process the document through various applications, and see what breaks.
Whichever the case, once you have identified an issue to investigate, you can use one of the following Autotests facilities to test your documents.
Test sets can also be created so that a batch of tests around a particular subject can be run on documents in the future.
The Autotest framework provides three test facilities:
- Process and visually compare a document
- Process and digitally evaluate a document
- Process and digitally evaluate a code fragment
Please note that any document uploaded to the Autotests platform is publicly accessible, so make sure you don't submit documents containing confidential or sensitive information.
Process and visually compare a document
The simplest facility lets a user upload a document and then automatically compare the output from any two registered applications side by side.
The process is extremely easy to use and quickly highlights issues to be addressed in greater detail. Documents are submitted through a simple drag-and-drop interface, and the facility can be used by any level of user. Note that this is a visual comparison: if the underlying document is damaged, this will not show.
Try it out: Upload a document and test it.
Process and digitally evaluate a document
Digitally evaluating a document requires a basic understanding of XML and the XPath language. This facility executes tests on the document supplied and reports on the status of the tests in a manageable way.
XPath is used to construct tests, which are run against the source document at the point of submission to ensure that they are valid for the source document.
Following processing at the factory, the XPath tests are run on all of the returned document versions and the results are presented.
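As an illustration of what such a test does (this is not the Autotests code itself), the sketch below counts paragraph elements in a content.xml document before and after a hypothetical round trip. The namespace URIs are the standard ODF ones; the path expression uses the limited XPath subset supported by Python's standard library:

```python
import xml.etree.ElementTree as ET

# Standard ODF namespace URIs.
NS = {
    "office": "urn:oasis:names:tc:opendocument:xmlns:office:1.0",
    "text": "urn:oasis:names:tc:opendocument:xmlns:text:1.0",
}

def count_paragraphs(content_xml: str) -> int:
    """A simple XPath-style test: count text:p elements in content.xml."""
    root = ET.fromstring(content_xml)
    return len(root.findall(".//text:p", NS))

# Hypothetical content.xml from a source document...
source = """<office:document-content
    xmlns:office="urn:oasis:names:tc:opendocument:xmlns:office:1.0"
    xmlns:text="urn:oasis:names:tc:opendocument:xmlns:text:1.0">
  <office:body><office:text>
    <text:p>First paragraph</text:p>
    <text:p>Second paragraph</text:p>
  </office:text></office:body>
</office:document-content>"""

# ...and a version returned by a factory that dropped a paragraph.
round_tripped = source.replace("<text:p>Second paragraph</text:p>", "")

print(count_paragraphs(source))         # → 2
print(count_paragraphs(round_tripped))  # → 1, so this application fails the test
```

A test like this passes when the count is unchanged across every returned version, which is what makes the report repeatable and easy for a developer to act on.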
Tests can be cloned to be improved by others and users can vote on the usefulness of each other's tests.
You can find this facility by clicking on the 'Make test' link of the visual comparison facility, by creating a test from scratch, or cloning an existing test.
Process and digitally evaluate a code fragment
The code fragment facility is similar to the document evaluation facility, but it enables the user to edit the XML manually, allowing finer-grained tests by overriding the styles.xml or content.xml components of the file.
To use this facility, follow the link to the digital document evaluation above, and add your content within the form.
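For example, a user might paste a minimal content.xml body such as the following hypothetical fragment (the element names come from the standard ODF text namespace, and the style name is invented for illustration) to isolate a single feature under test:

```xml
<office:body>
  <office:text>
    <!-- A single paragraph with a hypothetical named style, so the
         round-trip tests can focus on style handling alone. -->
    <text:p text:style-name="TestStyle">Fragment under test</text:p>
  </office:text>
</office:body>
```

Keeping the fragment this small makes it much easier to see which application mangled which element when the round-trip results come back.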