[Coco] Learning CPU Architecture and Digital Design

Mark McDougall msmcdoug at iinet.net.au
Tue Feb 19 17:02:44 EST 2013


On 20/02/2013 2:16 AM, T. Franklin wrote:

> Curious, Do these "commercial" cores do a better job of emulating the
> clock timing of the 6809 vs the "open source" cores?

I'm certainly *NOT* making any promises, but...

Over the past 10 months my hobby time has been reduced to zero with the 
arrival of my baby daughter. I used to allocate late-night hours to playing 
with retro stuff, but as anyone who has kids knows, that all goes out the 
window very quickly, and any spare time is spent trying to catch up on lost 
sleep.

It seems now that her sleep is starting to settle and there may be an 
opportunity to grab an hour here or there every few nights. So I've been 
scratching around for a project to work on that can be done in small chunks 
without too much loss of context in my brain.

All this talk of CPU cores has me thinking about the 6809 core that my 
colleague started a few years ago. He got a decent start, with the ALU done, 
all but one addressing mode, and a start on the instruction set itself. A 
lot of the remaining work is cranking the handle and working through the 
instruction set. He has also been developing a behavioural model in 
parallel to assist in testing/debugging. We also have hardware to connect a 
real 6809 to the DE1 that we've used successfully in the past.
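To give a feel for the sort of testing the behavioural model enables (this 
is only a sketch of the general approach, not our actual harness): if both 
the HDL simulation and the behavioural model can dump a register snapshot 
after every instruction, a small script can walk the two traces in lockstep 
and report the first divergence. The file names and trace format below are 
purely hypothetical.

#!/usr/bin/env python3
# Sketch only: compare per-instruction register traces from an HDL
# simulation and a behavioural 6809 model, reporting the first divergence.
# Assumed (hypothetical) trace format, one line per instruction:
#   PC=xxxx A=xx B=xx X=xxxx Y=xxxx U=xxxx S=xxxx DP=xx CC=xx
import sys

def load_trace(path):
    """Parse a trace file into a list of {register: value} dicts."""
    states = []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith('#'):
                continue
            regs = dict(field.split('=') for field in line.split())
            states.append(regs)
    return states

def compare(hdl_path, model_path):
    hdl = load_trace(hdl_path)
    model = load_trace(model_path)
    for i, (h, m) in enumerate(zip(hdl, model)):
        if h != m:
            diffs = [r for r in h if h.get(r) != m.get(r)]
            print(f"Divergence at instruction {i}: PC={h.get('PC')}")
            for r in diffs:
                print(f"  {r}: hdl={h.get(r)} model={m.get(r)}")
            return 1
    print(f"Traces agree for {min(len(hdl), len(model))} instructions")
    return 0

if __name__ == '__main__':
    # e.g. ./compare_trace.py hdl_sim.trace behavioural.trace
    sys.exit(compare(sys.argv[1], sys.argv[2]))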

Yesterday I dragged out the code and set up an environment to build it 
again. So far, so good.

Our goal was always cycle accuracy, with 6309 support to be added 
eventually. In hindsight, it would probably be better to aim for the 6309 up 
front and let the 6809 fall out of that. I also had the idea of adding 
Motorola/Freescale BDM (background debug mode) via JTAG, which would, in 
theory, allow hardware breakpoints and single-step debugging via standard 
tools/drivers on the PC. Ultimately you may even get GDB for GCC-6809!?!

Anyway, it's early days yet and, again, no promises it'll even get further 
than it is now (although I did fix a bug yesterday!) :P It's a long process 
of development and tedious testing under simulation and then against a real 
device; something that would take months even as a full-time paid task. Now 
scatter that over a handful of late nights and you can see there won't be 
any deliverables any time soon.

On a related note, I've still yet to release my Coco1/2 implementation. 
There are a couple of niggles to work out that I introduced at the last 
moment, and it doesn't yet support writes to the SD card on the 
DE1 (it writes to CF on other hardware), so I figured it's not much use to 
anyone else atm. FWIW I don't really have much interest in supporting DW, so 
it's all HDBDOS with the SD card acting as an IDE drive. Since everyone will 
be playing with Gary's Coco3FPGA, I won't bother releasing it until the core 
is done (never?) :P

Regards,

-- 
|              Mark McDougall                | "Electrical Engineers do it
|  <http://members.iinet.net.au/~msmcdoug>   |   with less resistance!"
