This 1990 Dover publication of the original 1965 edition serves as a great introduction to "statistical communication theory," otherwise known as Information Theory, a subject concerning the theoretical underpinnings of a broad class of communication devices. The exposition is based on Shannon's (not Wiener's) formulation of the theory, initiated in his breakthrough 1948 paper.

I purchased this book more than a couple of years ago as a beginning math grad student, mainly interested in (quickly and affordably) learning some basics of the subject without necessarily intending to specialize in it. In my opinion, the text should also be accessible to any engineering student with a one- or two-semester background in real analysis and a working knowledge of probability theory (also summarized at the beginning of the book). Topics discussed include: noiseless coding, discrete memoryless channels, error-correcting codes, information sources, channels with memory, and continuous channels. There are some very illuminating historical notes and remarks, as well as problem sets at the end of each chapter with solutions included at the back of the book, making it ideal for self-study.

Aside from being a great resource for learning the basics, however, the book's sole drawback is that all the results and theorems presented therein date from the 1950s and early 1960s, so one will have to look elsewhere to find out about more recent developments in the field.