Isis,
An article with n lines will have an article size of 128 * n bytes.
The total number of lines will be your 5GB divided by 128. The number of articles needed will be the total number of lines divided by the number of lines per article.
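If it helps, the arithmetic above can be sketched in a few lines of Python (the 128 bytes per line is the figure stated above; the article sizes below are just example numbers):

```python
# Back-of-the-envelope calculation: how many articles does a 5GB post need?
BYTES_PER_LINE = 128  # approximate size of one encoded line, as stated above

def articles_needed(total_bytes, lines_per_article):
    """Articles needed to post total_bytes, rounding the last article up."""
    total_lines = -(-total_bytes // BYTES_PER_LINE)   # ceiling division
    return -(-total_lines // lines_per_article)

five_gb = 5 * 1024**3
print(articles_needed(five_gb, 5000))    # with 5000-line articles
print(articles_needed(five_gb, 10000))   # with 10000-line articles
```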
The total number of lines of a posting will only be affected by the header overhead (whether it's posted with Power Post or any other program), the header being everything that's not the encoded part (yEnc, MIME, or any other encoding). Like I stated, this is a very small percentage.
About text messages: the nature of usenet is that anything posted on usenet (be it readable text or a binary posting) is posted as text. If you don't believe me: download an article without decoding it and view it in Notepad.
About your claim about size: I am sorry to disappoint you. I do know how usenet works and how newsreaders work (I date from the very early years of usenet, before the sophisticated newsreaders there are today; hell, in the beginning we had to decode the files manually!).
You are right about the number of confirmations. You are right about the storage needed to store the headers. However, confirmation time might affect a poster (server time), but it does not affect the needed storage. Yes, more storage is needed to store more headers (and more headers need to be downloaded), but it does not affect the needed storage for the complete posting. You will still need about 5GB of server storage whether you use 20,000 or 40,000 articles (the 40,000 articles will need about 2-4MB of extra storage server-side).
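To put a number on that 2-4MB: a rough sketch, assuming a hypothetical ~100 bytes of header data per article (real header sizes vary by poster and server):

```python
HEADER_BYTES = 100  # assumed per-article header overhead (hypothetical)

def header_storage(num_articles, header_bytes=HEADER_BYTES):
    """Total server-side storage taken by the headers alone."""
    return num_articles * header_bytes

# Doubling the article count from 20,000 to 40,000 only adds header data;
# the ~5GB payload itself stays the same size.
extra = header_storage(40_000) - header_storage(20_000)
print(extra)  # extra bytes next to the ~5GB payload
```

With ~100-byte headers that's about 2MB extra; with ~200-byte headers about 4MB, which is where the 2-4MB range comes from.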
If you want, you can compare it like this: how much more storage will you need if you make your rars 25MB instead of 50MB? That's not a whole lot. Only the rar overhead.
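The same comparison can be sketched in Python, assuming a hypothetical ~20KB of rar header overhead per volume (the real figure depends on the rar version and options):

```python
RAR_OVERHEAD = 20 * 1024  # assumed rar header bytes per volume (hypothetical)

def total_with_overhead(payload_bytes, volume_bytes):
    """Payload plus rar overhead for the number of volumes it splits into."""
    volumes = -(-payload_bytes // volume_bytes)  # ceiling division
    return payload_bytes + volumes * RAR_OVERHEAD

five_gb = 5 * 1024**3
big = total_with_overhead(five_gb, 50 * 1024**2)    # 50MB volumes
small = total_with_overhead(five_gb, 25 * 1024**2)  # 25MB volumes
print(small - big)  # only a couple of MB difference on a 5GB post
```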
Finally, I never said that binary postings don't belong on usenet. I only stated that usenet was not
designed for them and that the limit on article size originates from its original use. I certainly have no problem at all with binary postings themselves; we started this site years ago to make usenet, and binary postings in particular, better known and more easily accessible to the general public!
Although the original standard does not dictate a maximum number of lines (see:
http://www.freesoft.org/CIE/RFC/Orig/rfc1036.txt), whether you like it or not, it's currently a fact that not many usenet servers support large article sizes (and throughout history gigabyte has been an odd player on the field, not fully following NNTP standards and/or agreements). When not all servers propagate large article sizes, your large posting will only be available on the servers that do.
NZBs and indexing sites were invented because of the large number of headers in groups. Since they exist, you don't need to download headers in a conventional newsreader if you don't want to. You can simply search online at an indexing site to find what you are looking for.
To summarize: you are right if you only take header data into account. If you look at the total needed storage of a posting, the extra header data is not that much, and if you use NZBs anyway, it's not relevant at all.