Algorithm complexity and "big O" notation are pretty important to know about if you're doing any real programming (possibly less so if you're just gluing bits of GUI together or something basic like that).
It can be the difference between having something totally unusable that takes minutes or hours to run, and knowing (or being able to devise) a clever O(log n) or O(n log n) method that does the same thing in seconds or in real time. The same goes for memory consumption (although that's less of a concern than it used to be).
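To give a concrete feel for that gap, here's a quick toy sketch (my own illustration, not from any particular library) comparing an O(n) linear scan against an O(log n) binary search over the same sorted data:

```python
import bisect

# Sorted list of one million even integers.
data = list(range(0, 2_000_000, 2))

def linear_contains(xs, target):
    """O(n): scans every element in the worst case."""
    for x in xs:
        if x == target:
            return True
    return False

def binary_contains(xs, target):
    """O(log n): repeatedly halves the sorted search space."""
    i = bisect.bisect_left(xs, target)
    return i < len(xs) and xs[i] == target

print(linear_contains(data, 1_999_998))  # True, after ~1,000,000 comparisons
print(binary_contains(data, 1_999_998))  # True, after ~20 comparisons
print(binary_contains(data, 3))          # False (odd numbers aren't present)
```

Both give the same answer, but one touches around twenty elements where the other can touch a million, and that ratio only gets worse as n grows.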
Also, having some understanding will help you avoid common newbie pitfalls at the design stage, like unknowingly trying to solve a variant of the Travelling Salesman Problem, wasting a few days writing your code, and then wondering why it takes forever and/or just never terminates.
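To see why that pitfall bites, here's a minimal brute-force TSP sketch (a toy of my own, with a made-up distance matrix): it works fine for a handful of cities, but the number of tours is (n - 1)!, so it blows up long before the input looks "big":

```python
import itertools

def brute_force_tsp(dist):
    """Try every tour starting at city 0 -- (n - 1)! permutations."""
    n = len(dist)
    best_cost, best_tour = float("inf"), None
    for perm in itertools.permutations(range(1, n)):
        tour = (0,) + perm
        cost = sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))
        if cost < best_cost:
            best_cost, best_tour = cost, tour
    return best_cost, best_tour

# Tiny symmetric example: instant at n = 4...
dist = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 8],
    [10, 4, 8, 0],
]
print(brute_force_tsp(dist))  # → (23, (0, 1, 3, 2))

# ...but (n - 1)! means n = 15 already needs ~87 billion tours,
# and n = 20 needs ~1.2e17 -- i.e. "never terminates" in practice.
```

Spotting that factorial growth at the design stage is exactly where a bit of complexity knowledge pays for itself.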
There's a lot of other theory and discrete maths that you'll learn in the first year of a Comp Sci degree which will rarely be of any direct use (to a working programmer at least), but a decent "arsenal" of algorithms and some knowledge of algorithm complexity is essential IMO.
Juk
Last edited by jukofyork; 03-27-2011 at 01:43 PM.