It wouldn't be good for streaming. But you could store a whole movie library, and just decompress what you want to watch into a buffer on your HD. (I'm assuming you can seek randomly into the data. Otherwise, it would not be practical except for archival purposes.)
The whole argument for binary data is reliability. As the data is carried electrically, all you care about is on/off, high/low. That's difficult to misread. I wonder how the reliability of this new approach would compare.
Edit:
The problem with more symbols is that it could lead to ambiguity - e.g., an octagon could look like a circle or a hexagon, leaving a lot more room for errors in interpretation.
Binary data has the least ambiguity: it's either there or it's not.
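To put a rough number on that, here's a toy Python simulation (my own made-up model with made-up noise figures, nothing from the article): split the same signal range into 2 levels vs. 8 levels, add the same amount of read noise, and count how often a reading gets decoded as the wrong symbol.

```python
# Toy model: same 0..1 signal range, same Gaussian read noise,
# different numbers of symbol levels packed into that range.
import random

def symbol_error_rate(num_levels, sigma, trials=100_000):
    """Send random levels spaced evenly in [0, 1], add Gaussian noise,
    decode to the nearest level, and return the fraction misread."""
    step = 1.0 / (num_levels - 1)
    errors = 0
    for _ in range(trials):
        sent = random.randrange(num_levels)
        reading = sent * step + random.gauss(0, sigma)
        decoded = min(num_levels - 1, max(0, round(reading / step)))
        errors += (decoded != sent)
    return errors / trials

for levels in (2, 4, 8):
    print(levels, "levels:", symbol_error_rate(levels, sigma=0.08))
```

With those made-up numbers, the 2-level case basically never misreads, while the 8-level case gets a noticeable fraction wrong - that's the trade-off: more data per mark, less margin per mark.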
That's right in line with my reliability argument.
There's a lot of skepticism in the comments I read. I think the key is whether this more-analog approach can be brought up to full reliability. Otherwise, it would be a step backward.