[Coco] Minted buffer size

Gene Heskett gheskett at wdtv.com
Sat Mar 7 18:32:16 EST 2015



On Thursday 05 March 2015 13:04:06 Luis Antoniosi (CoCoDemus) wrote:
> Yes it will.

Unfortunately, though probably with only the last half hour's data lost, I 
did find the memory limit, several seconds after I had typed the last 
character.  I was perusing a previous printout for the next name I wanted 
to note down, as I was breaking some code stanzas up into smaller pieces 
so I could reuse them in the event one typed bootlink ?.

However, when the "memory full - press break" pops up, there is no 
recovery other than tapping the reset button.  I must have pressed break 
50 times trying to get a backspace delete in before it popped up again.

So now, at least until minted grows a block mapping process similar to 
how myram does it, which on my coco3 would give it access to about 1.7 
megabytes of memory, I am stuck with splitting it into smaller pieces, 
then putting in "use nextfile" statements to chain it to the next piece as 
it's being assembled.

Either that, or write it to /x1 & do an os9copy to get it to this machine, 
where I have editors that can handle 100 megabyte files.  But since that 
would require EOL conversions, and those same EOL bytes are also part of 
some of its strings, I could really bollix things up doing that.  So that 
is a last resort.

So, I'll break it into about 4 parts and put in the "use filename" 
statements.


> Minted has a very simple memory manager. It allocates 54K for the
> buffer. Then malloc will allocate blocks inside this buffer using
> 2 bytes for size + data buffer. To get the next chunk, add the size to
> the address returned by malloc. The first bit is the used/free flag.
> malloc can allocate a maximum string of 32K. Inside the malloc'd block
> it then creates the string structure with pointers for previous/next
> row, length, etc.
>
> The malloc is smart enough to split and coalesce memory chunks.
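
Just so I'm picturing that layout right, below is a rough C sketch of that 
kind of scheme: one fixed 54K pool, a 2-byte size word in front of every 
chunk with the top bit as the used/free flag, first-fit allocation with a 
split, and a simple coalesce-with-the-next-chunk on free.  The names 
(pool_alloc, pool_free, and so on) are just mine for illustration, not 
anything out of minted's actual source.

/* Sketch only -- NOT minted's code.  2-byte size word per chunk,
 * top bit = "in use", first-fit with split, forward coalesce on free. */
#include <stdint.h>
#include <stddef.h>

#define POOL_SIZE  (54u * 1024u)   /* the 54K buffer */
#define USED_FLAG  0x8000u         /* top bit of the size word = in use */

static uint8_t pool[POOL_SIZE];

static uint16_t get_size(uint8_t *p) { return (uint16_t)((p[0] | (p[1] << 8)) & ~USED_FLAG); }
static int      is_used(uint8_t *p)  { return (p[0] | (p[1] << 8)) & USED_FLAG; }
static void     set_hdr(uint8_t *p, uint16_t size, int used)
{
    uint16_t w = (uint16_t)(size | (used ? USED_FLAG : 0));
    p[0] = (uint8_t)(w & 0xff);
    p[1] = (uint8_t)(w >> 8);
}

void pool_init(void)                  /* start with one big free chunk */
{
    set_hdr(pool, POOL_SIZE - 2, 0);
}

void *pool_alloc(uint16_t want)       /* requests stay well under 32K */
{
    uint8_t *p = pool;
    while (p < pool + POOL_SIZE) {
        uint16_t sz = get_size(p);
        if (!is_used(p) && sz >= want) {
            if (sz >= want + 4) {     /* split off the remainder */
                set_hdr(p + 2 + want, (uint16_t)(sz - want - 2), 0);
                set_hdr(p, want, 1);
            } else {
                set_hdr(p, sz, 1);    /* close fit, take the whole chunk */
            }
            return p + 2;             /* data starts just past the header */
        }
        p += 2 + sz;                  /* next chunk = address + header + size */
    }
    return NULL;                      /* "memory full" */
}

void pool_free(void *mem)
{
    uint8_t *p = (uint8_t *)mem - 2;
    uint16_t sz = get_size(p);
    uint8_t *next = p + 2 + sz;
    /* coalesce with the following chunk; a fuller version would also
     * merge backwards */
    if (next < pool + POOL_SIZE && !is_used(next))
        sz = (uint16_t)(sz + 2 + get_size(next));
    set_hdr(p, sz, 0);
}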

It must have done that coalescing after I quit typing for 20 secs or so.  
I had been adding comments of 50 bytes or so to code not previously 
commented, and had done that to about 100, maybe 120 lines of code when 
it barfed.

The source file I started out with is $6B4E long, i.e. 27,470 bytes, and I 
had added about 5 kilobytes worth of comments when the OOM popped up.  
So theoretically, I should have had around 20 kilobytes of play room.

But Yogi Berra's famous comment comes immediately to mind, the one about 
theory vs practice. ;-)

As we don't have a head or tail function in os9, I expect to have to send 
it up here to do that breakup unless someone has a better idea.

Any ideas for the splitter, short of writing a quick and dirty one 
myself?  Ideally, set it for 200-line individual files, BUT when it has 
hit the 200 lines, have it continue until it finds an rts, then dump what 
it has & reset the line counter to cut out the next 200+ line piece.
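
If nobody has one handy, here is roughly what I have in mind, as a plain C 
sketch I'd compile and run up here rather than on the coco.  The 
part%03d.a output names and the bare strstr() test for "rts" are just 
placeholders, and it makes no attempt to skip a comment line that happens 
to mention rts:

/* Quick-and-dirty splitter sketch.  Reads asm source on stdin and
 * writes part000.a, part001.a, ...  Each part gets at least MIN_LINES
 * lines, then keeps going until a line containing "rts" has been
 * written, so no subroutine is cut in half. */
#include <stdio.h>
#include <string.h>

#define MIN_LINES 200

int main(void)
{
    char line[1024], name[32];
    int part = 0, count = 0;
    FILE *out = NULL;

    while (fgets(line, sizeof line, stdin)) {
        if (!out) {                       /* open the next part file */
            sprintf(name, "part%03d.a", part++);
            out = fopen(name, "w");
            if (!out) { perror(name); return 1; }
            count = 0;
        }
        fputs(line, out);
        count++;
        /* once past the quota, cut at the first line mentioning rts */
        if (count >= MIN_LINES && strstr(line, "rts")) {
            fclose(out);
            out = NULL;
        }
    }
    if (out)
        fclose(out);
    return 0;
}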

FWIW, I did at one point attempt to load the .lst file from this 
package, which is 49-some kilobytes long, and got an OOM from that, but 
it still allowed a show of the file (although I did not scroll to the 
end to see if it was all there), and I could still do a clean ctl-e.  So I 
suspect the real issue may be wasted space from all the fragmentation of 
the internal memory that I created with my comment additions.

Would a frequent save, quit and reload put this off?  Letting it 
reorganize memory with each rerun?

> Luis Felipe Antoniosi

[...]

Cheers, Gene Heskett
-- 
"There are four boxes to be used in defense of liberty:
 soap, ballot, jury, and ammo. Please use in that order."
-Ed Howdershelt (Author)
Genes Web page <http://geneslinuxbox.net:6309/gene>

