One domain often confronted with this term is the world of XML and its related "standards". There are lots of them out there, and some are really stable, useful, and even interoperable in practice (e.g. XML 1.0, XSLT 1.0, XPath 1.0).
By the way, just using XML does not guarantee interoperability for your data. You only get that if application behavior is also addressed by a related standard. XML-related standards that try to achieve this (e.g. SVG) often fail, or they are difficult to use because they are missing essential features that a specific user domain requires, so tool vendors / application providers add them in a tool-specific way. Or the standard is too complex to implement a 100% compliant application (e.g. XLink).
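A small illustration of that first point (both vocabularies here are invented for this post): each document below is perfectly well-formed XML and carries exactly the same data, but without an agreed schema or transformation, no tool can treat them as interchangeable.

```xml
<!-- Variant A: one tool's invented vocabulary -->
<contact>
  <name>Ada Lovelace</name>
  <mail>ada@example.org</mail>
</contact>

<!-- Variant B: another tool's invented vocabulary for the same data -->
<person full-name="Ada Lovelace">
  <email address="ada@example.org"/>
</person>
```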
DITA, for example, a newer OASIS standard / information architecture aimed mainly at techdoc-related content, increasingly faces these issues. The standard was designed with customization in mind: specialization to specific needs is part of the architecture. But there are of course still limitations, and in general there are good reasons for those limitations.
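As a rough sketch of how that specialization mechanism works (the <apiRef> and <apiBody> elements are hypothetical, not part of the standard): every specialized element carries a class attribute recording its ancestry, so a generic DITA processor can fall back to the base types it already understands.

```xml
<!-- Hypothetical specialization: <apiRef> derived from the standard <reference> topic. -->
<!-- The class attribute records the ancestry, so a generic DITA tool can still
     process this as a plain reference/topic even if it knows nothing about apiRef. -->
<apiRef id="connect-call" class="- topic/topic reference/reference apiRef/apiRef ">
  <title class="- topic/title ">connect()</title>
  <apiBody class="- topic/body reference/refbody apiRef/apiBody ">
    <section class="- topic/section ">
      <p class="- topic/p ">Opens a connection to the server.</p>
    </section>
  </apiBody>
</apiRef>
```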
The initial standard was not feature-complete (essential requirements were missing from the users' point of view), and therefore vendors / consultants / end users added non-compliant features for their specific needs, which often results in missing the goal of interoperability.
Why is DITA still successful?
To understand this, you have to consider two things:
- Keep in mind that just using XML does not meet your interoperability goals without additional effort.
- Keep in mind that fully interoperable data is not always what you need. Regular business cases often work well with an interoperable subset, or with a predefined transformation on demand (see the sketch after this list).
And that is the key feature if you think about organization-specific information models.
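As a minimal sketch of such an on-demand transformation, here is an XSLT 1.0 stylesheet (reusing the invented <contact> and <person> vocabularies from the example above) that maps one tool's model onto the other, so the two tools can exchange data without sharing a common model:

```xml
<?xml version="1.0"?>
<!-- XSLT 1.0 sketch: map the invented <person> vocabulary onto the invented <contact> one. -->
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:output method="xml" indent="yes"/>

  <!-- Rewrite each <person> element as an equivalent <contact> element. -->
  <xsl:template match="person">
    <contact>
      <name><xsl:value-of select="@full-name"/></name>
      <mail><xsl:value-of select="email/@address"/></mail>
    </contact>
  </xsl:template>
</xsl:stylesheet>
```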