Re: New scraper and piggy bank failure with 64 or more scraped items.

From: Steve Dunham <dunhamsteve_at_gmail.com>
Date: Fri, 13 Jan 2006 13:48:54 -0800

On 1/13/06, Eric Miller <em_at_w3.org> wrote:
>
> A quick 2-sec look points at the '&' character on line 700 of the
> file. Could that be the problem?

It's not the '&' character. Both the working and non-working files
contain it, but I took it out just to make sure, and the problem still
occurs.

I find it interesting that this happens at exactly a power of 2
items, as if a hash table is going wiggy or something. The two files
are identical except for one resource at the end of the non-working
file; I arrived at them by cutting a broken file down until it worked.
The problem also occurred when I tried adding a different resource
from the original set as the 64th. If I reduce the data to just name
and type, it works.
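
To illustrate the power-of-2 hunch, here is a made-up Java sketch
(not Piggy Bank's actual code; the class is invented for illustration)
of the kind of off-by-one at a 64-slot boundary that would produce
exactly this symptom:

    // Hypothetical illustration only, not from the Piggy Bank source.
    class LeakyStore {
        private final String[] slots = new String[64];
        private int count = 0;

        void add(String item) {
            // Bug: the guard should be count < slots.length. The
            // stray -1 rejects index 63, so the 64th and every
            // later item are silently dropped.
            if (count < slots.length - 1) {
                slots[count++] = item;
            }
        }

        int size() { return count; }  // never gets past 63
    }

With a 64-slot array that guard admits indices 0-62, i.e. 63 items,
which matches what I'm seeing.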

One other data point: when I try to do "export", I get an error 500
with "We don't handle AnonymousNodes".

This could be a file size issue, too. If I run "doesnt.n3" or the
full set, "full.n3", through cwm first, they both load into Piggy
Bank, but every record after the 63rd is missing.
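
(By "through cwm" I mean cwm's default N3-in, N3-out round trip,
something like the following -- the output names are just examples:

    cwm doesnt.n3 > doesnt-clean.n3
    cwm full.n3 > full-clean.n3

Since cwm re-serializes the graph, this mostly rules out low-level
syntax problems in the original files.)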