#6 in Information theory books
Reddit mentions of An Introduction to Information Theory: Symbols, Signals and Noise (Dover Books on Mathematics)
Sentiment score: 7
Reddit mentions: 10
We found 10 Reddit mentions of An Introduction to Information Theory: Symbols, Signals and Noise (Dover Books on Mathematics). Here are the top ones.
Buying options
View on Amazon.com or Dover Publications
Specs:
Height | 8.5 Inches |
Length | 5.5 Inches |
Number of items | 1 |
Release date | November 1980 |
Weight | 0.77 Pounds |
Width | 0.75 Inches |
You are in a very special position right now where many interesting fields of mathematics are suddenly accessible to you. There are many directions you could head. If your experience is limited to calculus, some of these may look very strange indeed, and perhaps that is enticing. That was certainly the case for me.
Here are a few subject areas in which you may be interested. I'll link you to Dover books on the topics, which are always cheap and generally good.
Basically, don't limit yourself to the track you see before you. Explore and enjoy.
i recently read Pierce's An Introduction to Information Theory and was pleased. while it's not recent it's a good intro, if that's what you're looking for. also it's a dover edition so it's priced very low.
They may not be the best books for complete self-learning, but I have a whole bookshelf of the small introductory topic books published by Dover: books like An Introduction to Graph Theory, Number Theory, An Introduction to Information Theory, etc. The books are very cheap, usually $4-$14. The books are written in various ways; for instance, the Number Theory book is highly proof and problem based if I remember correctly, whereas the Information Theory book is more of a straightforward natural-language summary of work by Claude Shannon et al. I still find them all great value and great to blast through in a weekend to brush up on a new topic. I'd pair each one with a real learning text with problem sets etc., reading the Dover book first quickly since it introduces the reader to any unfamiliar terminology that may be needed before jumping into other step-by-step learning texts.
From the ground up, I dunno. But I looked through my amazon order history for the past 10 years and I can say that I personally enjoyed reading the following math books:
An Introduction to Graph Theory
Introduction to Topology
Coding the Matrix: Linear Algebra through Applications to Computer Science
A Book of Abstract Algebra
An Introduction to Information Theory
If you're interested in the more computers-and-signal-processing side of neuroscience, you'll need a bunch of math. If you're interested, check out this book (http://www.amazon.com/An-Introduction-Information-Theory-Symbols/dp/0486240614/ref=sr_1_1?ie=UTF8&qid=1371748392&sr=8-1&keywords=information+theory). I read it after going to college, so it may have a smidge of calculus in there, but at least the beginning (which is all the interesting stuff) is simple enough to not need it.
Information theory is one of those math topics that makes you rethink more than just math, hence the recommendation.
The Dover information theory book is also the same price.
Now I'm hunting for other Dover books I may want. Thanks!
Hmmm for a biologist I am not really that sure. It is a branch of mathematics, mostly statistical... Shannon's original book/expanded paper is available here and I have read it, but it is very engineery if you know what I mean.
But I think that kind of illustrates the point... creationist types seem to see some profound truth in information theory... but it is really a field developed to deal with the transmission of messages in a general way and with errors in that transmission. The entire reason for it is to come up with ways to minimise errors in message transmission, and that is all EEs care about... that they get the same message at the receiver that was sent by the transmitter.
To expand this to biology I suppose you could roughly say that the message is the genome being transmitted down the generations. But the way errors are dealt with is entirely different... selection takes care of the "bad" messages. And evolution, as far as I know, does not have the goal of replicating the same information... but rather the opposite.
NINJA EDIT: Claude E. Shannon should be a lot better known for his contribution to humanity.
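The point above about minimising transmission errors can be made concrete with Shannon's entropy, which measures how many bits per symbol a message source actually carries. A minimal sketch in Python (the function name and example distributions are my own, not from the comment):

```python
import math

def entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries a full bit per toss; a biased coin carries less,
# which is exactly the redundancy an engineer can exploit for compression
# or spend on error correction.
print(entropy([0.5, 0.5]))   # 1.0
print(entropy([0.9, 0.1]))   # ~0.47
```

The lower the entropy of the source, the fewer bits per symbol are needed at the receiver, which is the quantity Shannon's coding theorems are built around.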
For a very good read on entropy that requires only high school math check out:
http://www.amazon.com/Introduction-Information-Theory-John-Pierce/dp/0486240614
There's an excellent, inexpensive book I love on the topic: An Introduction to Information Theory: Symbols, Signals and Noise. I strongly recommend it.
Edit: You don't need to be an expert in maths to get the gist of the book, the exposition is great.
>And that's something that puzzles me. All these AI guys searching for the Secret of ConsciousnessTM, and no one ever stopping to ask, hey, what if there is no secret?
To be fair, there isn't much active research being done among computer scientists in private industry to search for the Secret of Consciousness^TM (as far as I know). While it's cool to dream about futuristic robots and computers that can pass the Turing Test, there isn't much money to be found by directly developing a generalized AI, and academics for the most part don't produce shit in the way of practical science. So private industry focuses on generalizing algorithms only to the extent of performing a specific task, while guys like me hope that eventually these tasks will become general enough that we can arbitrarily decide that we've created a strong AI.
>What if there is no strong AI barrier other than computational limits of modern computers?
>Every time we figure out how the brain does something, we find brute-force computation rather than sophisticated algorithms.
Once you move into the realm of practically unlimited computational power, you can dispense with algorithms altogether. But where's the elegance in that?
It reminds me of a funny historical note from the development of Fourier analysis. While mathematicians were trying to prove the convergence of Fourier series, dragging around the Dirichlet kernel in all their proofs, the engineers were perfectly happy using the Dirac delta approximation. Thirty years later, when the mathematicians finally came up with a formal proof that allowed them to use the Dirac delta approximation as well and were like "look how awesome this is", the engineers were like "duh, where have you guys been."
Point is, there's no elegance in saying that the Secret of Consciousness^TM lies in a brute force approach. That would be admitting that our brains, as awesome as they are, are just glorified roombas.
I'm just an EE guy who likes algorithms on the side, but as a real computer scientist you can probably answer this better than I can - if we disregard computational efficiency, can't every algorithm be explicitly programmed simply using an if/else control flow?
For what it's worth, I'm fairly certain that you are correct that the brain isn't wired efficiently, but has the luxury of getting away with it because it has billions of neurons offering a huge advantage in raw computational potential. Not that this statement is worth much, as it is just a gut feeling that I can't back up.
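The if/else question above can be illustrated with a toy: any function over a finite domain can be written as a brute-force enumeration of its cases, with no cleverer algorithm involved. A hedged sketch in Python (the function `xor_table` is my own illustrative example, not from the comment):

```python
def xor_table(a, b):
    """XOR of two bits, written as an explicit if/else
    enumeration of the truth table rather than as arithmetic."""
    if a == 0 and b == 0:
        return 0
    elif a == 0 and b == 1:
        return 1
    elif a == 1 and b == 0:
        return 1
    else:
        return 0

# The enumeration agrees with the "real" operator on every input.
for a in (0, 1):
    for b in (0, 1):
        assert xor_table(a, b) == a ^ b
```

This only works for finite (or finitely approximated) inputs, of course, and the table grows exponentially with input size — which is exactly the sense in which brute force trades elegance for raw computation.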
>Every time we get computers to solve a heuristic problem, we cease to think it as belonging to the field of "artificial intelligence".
Disagree. One can draw a distinction between what is and what isn't a learning algorithm, but I would consider a roomba to be a rudimentary artificial intelligence. But that comes down to personal preference and a lack of precise language.
>Would we stop thinking of ourselves as conscious, self-moving minds? Would we dispense with the notion of "free will"? Or would we merely re-arrange the notion of identity?
You can ask these same questions despite the fact that we don't know what's running under the hood of the consciousness machine. If you believe in science then you believe that everything in nature has some kind of logical explanation without the hocus-pocus of religion and morality, even if modern science isn't there yet.
The only question that remains is where it all began. But that's a discussion for another day.