Quote:
Originally Posted by suzzer99
For some reason n/2 didn't feel right, so I just guessed log(n) based 100% on your post. But it came off like I reasoned it out on the spot. BOOM
Yeah, O(n/2) just isn't a thing - that's just O(n). So are n/100, n/1e6, n/10^100 and so forth. So are 10n, 1000n, and 10000000000n. All O(n)
I know you don't have a CS background, but interviewers who ask about O(n) are really going to harp on this. When you do an algorithm on a whiteboard and someone asks "can you do better," they don't mean "can you make it twice as fast" - they mean "can you take it from n^2 to n log n or n"
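To make that concrete, here's a hypothetical example (not from the original post): checking a list for duplicates, done three ways. An interviewer's "can you do better" means moving down this ladder, not shaving constant factors off the first version.

```python
def has_duplicate_quadratic(xs):
    # O(n^2): compare every pair of elements
    for i in range(len(xs)):
        for j in range(i + 1, len(xs)):
            if xs[i] == xs[j]:
                return True
    return False

def has_duplicate_nlogn(xs):
    # O(n log n): sort first, then duplicates are adjacent
    s = sorted(xs)
    return any(a == b for a, b in zip(s, s[1:]))

def has_duplicate_linear(xs):
    # O(n): set membership is O(1) on average
    seen = set()
    for x in xs:
        if x in seen:
            return True
        seen.add(x)
    return False
```

All three return the same answer; only the growth rate changes as the input gets big.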
As a digression, I remember when I was taking discrete math, which was all about combinatorics, graphs, trees, etc., some students were complaining about what this was "for" - what purpose or use did it have - and the teacher, who I really liked a lot, got very frustrated. He came back to the next class and said, we're going to write a program to solve this problem. I don't remember exactly what the problem was, but it involved counting how many numbers between 1 and n had some specific property.
Together we wrote a program, and then he said, can anyone tell me what the performance of this program is? Someone said "O(n)" and he was like, close, but no. Then someone figured out that it was O(n log n), because as the numbers got bigger you had to examine more digits. How many digits does the number "n" have? Approximately log10(n). So we're all feeling pretty proud of ourselves.
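Since the exact problem is forgotten, here's a stand-in with the same shape (my invention, not the teacher's problem): count how many numbers from 1 to n contain the digit 7. The loop runs n times, and each digit check costs about log10(n) work, hence O(n log n).

```python
def count_with_digit(n, d="7"):
    # Examine every number from 1 to n. Checking the digits of a
    # number costs roughly log10(n) work, so the loop overall is
    # O(n log n), not O(n).
    count = 0
    for i in range(1, n + 1):
        if d in str(i):
            count += 1
    return count
```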
Then the teacher busts out: you know we could actually just calculate how many numbers have, like, property A and how many have property B and then calculate how many have both. And he writes down a little equation for it and says "O(1). That's what this class is for"
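The teacher's trick is inclusion-exclusion: |A ∪ B| = |A| + |B| - |A ∩ B|, which is pure arithmetic with no loop at all. A sketch with an assumed concrete property (divisibility, my choice for illustration):

```python
from math import gcd

def count_div_a_or_b(n, a, b):
    # How many numbers in 1..n are divisible by a or b?
    # Inclusion-exclusion: count multiples of a, plus multiples of b,
    # minus multiples of lcm(a, b), which were counted twice.
    # Three divisions - O(1), no matter how big n is.
    lcm = a * b // gcd(a, b)
    return n // a + n // b - n // lcm
```

Same answer a loop would give, in constant time. That's what the class is for.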