So I was learning about my fancy new spectrum analyzer, which everyone should have and can be downloaded for free here:
And I noticed something weird. The vertical lines are all spaced out weird. I've just never questioned it before.
Look at A. What's with that goddamn huge gap?
I quickly realized why. The lines mark round-number frequencies: 200 Hz, 300 Hz, and so on.
I started to wonder about the way they're spaced, though: how at the low end one inch covers about 20 Hz, while at the high end one inch covers about 10,000 Hz.
I know it's done this way because linear displays are awkward as hell, and they make it difficult to tell the major frequency zones apart.
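For anyone curious about the math behind that spacing, here's a small sketch (the function name and the 20 Hz to 20 kHz range are just my assumptions, not anything from a specific analyzer): on a log axis, a frequency's horizontal position is proportional to the log of that frequency, so equal *ratios* get equal widths instead of equal *differences*.

```python
import math

def log_position(freq, f_min=20.0, f_max=20000.0, width=1.0):
    """Map a frequency to a horizontal position (0..width) on a
    log-scaled axis spanning f_min..f_max. Equal ratios of frequency
    map to equal distances on screen."""
    return width * math.log(freq / f_min) / math.log(f_max / f_min)

# A 10x jump (one decade) always covers the same distance,
# no matter where on the axis it happens:
low_decade = log_position(200) - log_position(20)       # 20 Hz -> 200 Hz
high_decade = log_position(20000) - log_position(2000)  # 2 kHz -> 20 kHz
print(round(low_decade, 6), round(high_decade, 6))      # both the same
```

That's also why the gridlines bunch up: on a log axis the gap from 100 to 200 Hz is way wider than the gap from 900 Hz to 1 kHz, even though the second pair is "further apart" in plain Hz.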
My question, then, is: who invented this logarithmic view? Is it an arbitrary convention, or something mathematical?
More importantly, do different spectrum analyzers use different logarithmic views?
Thanks for your time, mixing geniuses and math wizards.