Here's a question - what is the benefit of 1080p output from the A20? Even in the case of 1080p media output through a 1080i interface, the conversion should be lossless, no?
So it should be: 1080p (media) -> 1080i (output, weave) -> 1080i (input, de-weave) -> 1080p (display). In other words, you can fit a 1080p24 (24 Hz) signal entirely within a 1080i (60 Hz) output signal with no loss of signal, assuming your TV is 1080p and can do a non-cheesy de-interlacing of an incoming 1080i signal, from what I understand.
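Something like this toy sketch is what I have in mind (my own illustration, nothing to do with the A20's actual firmware): 2:3 pulldown spreads 24 progressive frames across 60 fields, and a TV that does a proper inverse telecine just weaves matching fields back together, bit for bit. The "frames" are tiny row-lists standing in for 1920x1080 images.

    def split_fields(frame):
        top = frame[0::2]        # even-numbered lines
        bottom = frame[1::2]     # odd-numbered lines
        return top, bottom

    def weave(top, bottom):
        frame = [None] * (len(top) + len(bottom))
        frame[0::2], frame[1::2] = top, bottom
        return frame

    def pulldown_2_3(frames):
        """24p -> 60i: alternate frames contribute 3 fields, then 2."""
        fields = []
        for i, frame in enumerate(frames):
            top, bottom = split_fields(frame)
            if i % 2 == 0:
                fields += [("t", i, top), ("b", i, bottom), ("t", i, top)]  # 3 fields
            else:
                fields += [("b", i, bottom), ("t", i, top)]                 # 2 fields
        return fields

    def inverse_telecine(fields):
        """Group each frame's top/bottom fields and weave them back together."""
        parts, order = {}, []
        for parity, idx, data in fields:
            if idx not in parts:
                parts[idx] = {}
                order.append(idx)
            parts[idx][parity] = data
        return [weave(parts[i]["t"], parts[i]["b"]) for i in order]

    frames = [[(f, line) for line in range(8)] for f in range(24)]  # 24 toy frames
    fields = pulldown_2_3(frames)
    assert len(fields) == 60                   # one second of 1080i60
    assert inverse_telecine(fields) == frames  # bit-exact recovery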
Unfortunately that is a very large assumption. Virtually no TV can do "non-cheesy" deinterlacing of 1080i. Most don't even try. Of the ones that do, most end up displaying 1080p/60 rather than 1080p/24 or 1080p/48, which reintroduces 3:2 judder on film material and is not good at all.
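The arithmetic behind that complaint (my numbers, not from the post above): at 1080p/60 the set has to hold alternate film frames for 3 fields and then 2, so display times flip between 50 ms and 33 ms instead of a steady 42 ms.

    at_60hz = [3 / 60, 2 / 60] * 12       # alternating 50 ms / 33.3 ms holds
    at_24hz = [1 / 24] * 24               # every frame held exactly 41.7 ms
    print(sum(at_60hz), sum(at_24hz))     # both cover 1.0 second of film
    print(max(at_60hz) / min(at_60hz))    # 1.5x variation -> visible judder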
It gets even worse when you have a source that was shot on hi-def video rather than film. Not that many TVs can do a good job of distinguishing between 1080i video and 1080i film, and, of those that can, not many will do proper per-pixel motion-adaptive deinterlacing on video sources.
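For anyone wondering what per-pixel motion-adaptive means in practice, here is a very rough sketch (toy code of mine, not any particular TV's algorithm): weave where a pixel hasn't moved between same-parity fields, bob (interpolate) where it has.

    def deinterlace_field(cur_bottom, prev_top, next_top, threshold=8):
        """Rebuild the missing top lines for the frame carried by cur_bottom.

        Fields are plain lists of rows of grey-level ints.
        """
        rebuilt_top = []
        for y, prev_row in enumerate(prev_top):
            row = []
            for x, prev_pix in enumerate(prev_row):
                moved = abs(next_top[y][x] - prev_pix) > threshold
                if moved:
                    # Bob: average the current field's lines above and below.
                    above = cur_bottom[y - 1][x] if y > 0 else cur_bottom[y][x]
                    below = cur_bottom[y][x]
                    row.append((above + below) // 2)
                else:
                    # Weave: the pixel is static, so the old line is still correct.
                    row.append(prev_pix)
            rebuilt_top.append(row)
        # Interleave the rebuilt top lines with the bottom field we actually received.
        frame = []
        for top_row, bottom_row in zip(rebuilt_top, cur_bottom):
            frame += [top_row, bottom_row]
        return frame

    # Quick check: motion everywhere, so every missing line is bobbed from cur_bottom.
    still, moving = [[50] * 4] * 4, [[200] * 4] * 4
    print(deinterlace_field(cur_bottom=still, prev_top=still, next_top=moving))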
It's worse still if you have to deal with 50Hz as well as 60Hz sources, but as far as I know all hi-def disc content will be 60Hz even in Europe, so we are at least spared that.
Oh, and I second all the points about HDMI versions. HDMI 1.3 may become important for lossless surround audio formats, but that's all. HDCP is perfectly happy over HDMI 1.1. 1.3 will also eventually become important for resolutions beyond 1920x1080 or video signal formats with more than 8 bits per colour channel, but that's years away, and by then no present-day player will be able to decode those discs anyway.
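For what it's worth, the back-of-envelope bandwidth math behind that (my numbers, not anything quoted in this thread): HDMI 1.0-1.2a top out at a 165 MHz pixel clock, HDMI 1.3 at 340 MHz, and 1080p60 uses the standard 2200 x 1125 total raster.

    total_pixels = 2200 * 1125              # active 1920x1080 plus blanking
    clock_8bit  = total_pixels * 60         # 148.5 MHz -> fits under the 165 MHz cap
    clock_12bit = clock_8bit * 12 // 8      # 222.75 MHz -> only fits under HDMI 1.3
    print(clock_8bit / 1e6, clock_12bit / 1e6)

So plain 1080p60 at 8 bits per channel is fine on the older spec; it's deep colour (or anything above 1920x1080) that actually needs 1.3's headroom.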