Re: [RT] Moving Piggy Bank forward...

From: Stefano Mazzocchi <stefanom_at_mit.edu>
Date: Thu, 28 Jul 2005 16:38:41 -0700

Jeen Broekstra wrote:
> Stefano Mazzocchi wrote:
>
>>> Unfortunately, there is currently no production-ready custom
>>> inferencer for the in-memory store (the available custom
>>> inferencer only operates on MySQL databases). There is some raw
>>> code on a new custom inferencer that uses SeRQL queries as the
>>> rule format, but we are short a number of hands to make that
>>> thing work properly. An alternative is perhaps OntoText's OWLIM
>>> package, which is an adapted in-memory store that can do simple
>>> OWL-Lite entailment.
>>
>>
>>
>> If you can point us to the code, maybe we can fill up those needed
>> hands.
>>
>> We *really* need to perform basic OWL-Lite (or even OWL Tiny?)
>> entailment.

Jeen,

thanks for your comments. See mine below.

> Ok, the OWLIM package can be found in the Sesame plugin section:
>
> http://www.openrdf.org/contrib.jsp
>
> It is a separate download, but from what I understand installation and
> use are dead-easy. Have a look, this might already solve your problem.

I'm on it.
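
To make sure I understand how it would plug in on our side, here is
roughly what I imagine the wiring looks like, using the standard Sesame
1.x configuration API. The OWLIM sail class name is only a placeholder
(I haven't unpacked the download yet), so the real name will have to
come from the OWLIM docs:

  import org.openrdf.sesame.Sesame;
  import org.openrdf.sesame.config.RepositoryConfig;
  import org.openrdf.sesame.config.SailConfig;
  import org.openrdf.sesame.repository.local.LocalRepository;
  import org.openrdf.sesame.repository.local.LocalService;

  public class OwlimSetupSketch {
    public static void main(String[] args) throws Exception {
      // Standard Sesame 1.x repository configuration; the only
      // OWLIM-specific part should be the sail class name, which is
      // a placeholder here (the real one is in the OWLIM docs).
      RepositoryConfig config = new RepositoryConfig("piggy-bank-owlim");
      config.addSail(new SailConfig("com.ontotext.owlim.OwlimSail"));
      config.setWorldReadable(true);
      config.setWorldWriteable(true);

      LocalService service = Sesame.getService();
      LocalRepository repository = service.createRepository(config);

      // From here on "repository" is used like any other Sesame
      // repository; the OWLIM sail should take care of the OWL-Lite
      // entailment as statements are added.
    }
  }

If it really is a drop-in sail like that, switching Piggy Bank over
should mostly be a configuration change. Please correct me if I'm off.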

> The SeRQL-based custom inferencer code was originally done by Alan
> Cyment (see
> http://www.openrdf.org/forum/mvnforum/viewthread?thread=252),
> unfortunately he had other commitments which made it impossible for
> him to finish the project. It is currently still in CVS, module
> 'openrdf', branch 'serql_custom_inferencing'. The basic idea is there
> but it is far from robust and will need tweaking before it even runs.
> But it does parse SeRQL queries and translates them to inference rules
> that can be applied.
>
> The idea behind this inferencer was to have a storage-independent part
> and a storage-specific part (so one for memory, one for RDBMS, etc.),
> to allow sharing of code while at the same time enabling
> storage-specific optimizations to the inferencing algorithm.
>
> To make things more complicated: we are currently working hard on
> Sesame 2.0, which is a complete revision of Sesame's internal and
> external APIs. The main goals of this revision are better transaction
> support, context support, and smoother integration into other
> projects. Part of this is a rethink of the inferencer design: we are
> hoping to decouple the inferencer from the store a bit more, so that
> dynamic switching from non-inferencing to inferencing (and vice
> versa), or replacing one inferencer with another becomes possible. All
> this is currently work in progress: the goal is to have a first beta
> release of Sesame 2.0 in early September.

ok

I'll play with the OWLIM reasoner and let you know. In the meantime,
please keep us posted on how we can help with the Sesame 2.0 effort.
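
On the SeRQL-based inferencer: just to check that I understand the
intended rule format, I am guessing a rule is essentially a SeRQL
CONSTRUCT query whose results get fed back into the store, and the rule
set is re-applied until nothing new is produced. A naive sketch of what
I have in mind (this is my guess, not taken from Alan's code):

  import org.openrdf.model.Graph;
  import org.openrdf.sesame.constants.QueryLanguage;
  import org.openrdf.sesame.repository.SesameRepository;

  public class SerqlRuleSketch {

    // One inference rule written as a SeRQL CONSTRUCT query: the
    // rdfs:subClassOf entailment (if X has type D and D is a subclass
    // of C, then X also has type C). The rdf: and rdfs: prefixes are
    // predefined in SeRQL.
    static final String SUBCLASS_RULE =
      "CONSTRUCT {X} rdf:type {C} " +
      "FROM {X} rdf:type {D}, {D} rdfs:subClassOf {C}";

    // Apply the rule once: whatever the query constructs is added back
    // to the repository. A real inferencer would iterate a whole rule
    // set like this until a fixpoint is reached.
    public static void applyOnce(SesameRepository repo) throws Exception {
      Graph inferred =
        repo.performGraphQuery(QueryLanguage.SERQL, SUBCLASS_RULE);
      repo.addGraph(inferred);
    }
  }

If that is more or less the idea, I can also see how the
storage-specific part would replace this naive loop with something
smarter for each backend.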

>>> As an aside: basic inferencing is still on the ToDo list for the
>>> native store. We are also awaiting a number of third party
>>> contributions which will hopefully significantly improve native
>>> store performance (better indexing). I'll keep you informed on
>>> progress if you want.
>>
>>
>>
>> Please do, we do want to be more active in helping to shape this
>> space in the future, but we are unsure of what/how to do it. More
>> collaboration would just be great for us, and yes, we are not just
>> the kind who demand a fix; we do it ourselves if we know we are
>> not stepping on other people's toes.
>
>
> Fixes and other contributions are _always_ most welcome :)
>
> The story is currently a bit complex since we are doing all this API
> revision stuff. Depending on how urgently you need things you can
> either decide to develop against the Sesame 2.0 code base (the APIs
> are there, but only one store implementation: in-memory), or against
> the Sesame 1.x code base (this will probably be quicker to get running
> but on the other hand you'll have to do a lot of refactoring work when
> the switch to Sesame 2.0 is made). Using the 2.0 APIs would be a big
> advantage to us as well by the way, since it will hopefully give us
> early feedback on the design from you guys. But it's up to you of course!

I'm not very keen on writing an application on top of a moving-target
API. I will be very happy to start porting it over to Sesame 2.0 once
you tell us that you feel the API is reasonably solid, but not before then.

I'll keep working on the 1.x codebase for now, knowing that I might end
up throwing some of the code away (which is not a big deal).

>> Ah, and I hate web forums with a passion ;-)
>
>
> *sigh* there's no pleasing some people... Heh, but seriously, I'm fine
> with discussing Sesame-related stuff here instead if that is ok with
> the rest of the list-inhabitants.

I would most gladly welcome it.

If you would like us to host a Sesame-specific mailing list, we'd be more
than happy to do that too.

-- 
Stefano Mazzocchi
Research Scientist                 Digital Libraries Research Group
Massachusetts Institute of Technology            location: E25-131C
77 Massachusetts Ave                   telephone: +1 (617) 253-1096
Cambridge, MA  02139-4307              email: stefanom at mit . edu
-------------------------------------------------------------------
Received on Thu Jul 28 2005 - 23:35:29 EDT
