Decomposing a Square Wave


I am revisiting a lot of basics, trying to clear up misconceptions I have developed over the years, and also to look behind sweeping statements I hear often. One such statement is “a square wave is composed of all the frequencies”.

From a Fourier perspective I understand this; I have proved it to myself with Excel (as much as one can prove an infinite series with Excel!): we can sum an infinite number of sine waves of varying frequency to achieve any arbitrary waveform (there are some magnificent YouTube videos on the subject). Thus we can create a square wave from a sufficient number of sine waves. No problem!

How is the reverse true, though? Say we take a square wave generated by someone accurately flipping a switch on/off, or the output of a 555, or just a transistor switch. There are no sine waves involved, yet many times I have heard of the issues caused with resonance (for example), in that a “square wave contains all the frequencies”: a square wave into a filter will cause ringing and oscillations… The square wave was not generated from sine waves, there are no sine waves to be found… yet we get issues as if the square wave were actually made of sine waves. If we apply a square wave to an RC filter we can get a triangular wave, and we can filter it down to a sine wave: sure, got that! But this one notion, of a resonant frequency being able to be plucked out of a square wave, has got me stumped.


The idea behind Fourier transformations is that “any signal can be represented as a combination of fundamental sine waves”. I believe this is different from “a square wave is composed of all the frequencies”.

I would decouple this notion from the actual generation of a more complex signal like a square or sawtooth. There will always be simpler ways to generate a signal than combining sine waves.

Instead, I think about this when I want to filter something like a square wave. If I had a perfect filter that cut off just past the fundamental frequency (say, a 10 kHz square wave and a perfect “brick wall” filter at 12 kHz), I should see only the 10 kHz fundamental sine wave come out on the other side. Here’s an article on the matter: Making Waves | Nuts & Volts Magazine.
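To make that brick-wall picture concrete, here is a minimal Python sketch (the 10 kHz / 12 kHz numbers are just the example values above): build the square wave's Fourier series from odd harmonics, then keep only the harmonics below the cutoff. A 12 kHz wall keeps only the n = 1 term, leaving a pure sine of amplitude 4/π.

```python
import math

def square_partial_sum(t, f0, n_harmonics):
    """Fourier series of a +/-1 square wave: (4/pi) * sum over odd n of sin(2*pi*n*f0*t)/n."""
    total = 0.0
    for k in range(1, n_harmonics + 1):
        n = 2 * k - 1                      # odd harmonics only
        total += math.sin(2 * math.pi * n * f0 * t) / n
    return 4.0 / math.pi * total

f0 = 10e3       # 10 kHz fundamental
cutoff = 12e3   # ideal "brick wall" low-pass

# Which harmonics survive the filter? For a 12 kHz wall, only the fundamental.
kept = [n for n in range(1, 100, 2) if n * f0 < cutoff]
print(kept)  # → [1]

# With only that term, the output is a pure 10 kHz sine of amplitude 4/pi.
peak = square_partial_sum(1 / (4 * f0), f0, 1)
print(round(peak, 4))  # → 1.2732
```

Raising the cutoff to, say, 35 kHz would let the third harmonic through as well, and the output would start looking square again.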

What’s nice about this is you can go and test it. Take a square wave as an input and measure it on a scope with an FFT function: you should see a set of peaks on the FFT that represent the fundamental and its harmonics. Then put it through the filter and look at the FFT on the other side, and you’ll see those harmonics are much reduced. This is useful in a lot of contexts, but I think it’s a great showcase of the frequency content of different types of signals.
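You can simulate that scope measurement, too. A sketch in pure Python (a naive DFT rather than a real FFT, and a square wave made by simple thresholding): the spectrum shows energy only at odd multiples of the fundamental, falling off roughly as 1/n, exactly what the scope's FFT would display.

```python
import cmath
import math

def dft_mag(x, k):
    """Magnitude of bin k of a naive DFT, normalized by N (roughly what a scope FFT shows)."""
    N = len(x)
    return abs(sum(x[n] * cmath.exp(-2j * math.pi * k * n / N) for n in range(N))) / N

# Four full cycles of a +/-1 square wave in a 256-sample record: fundamental lands in bin 4.
N, cycles = 256, 4
period = N // cycles
x = [1.0 if n % period < period // 2 else -1.0 for n in range(N)]

# Odd harmonics come out near 2/(pi*n); even harmonics are absent.
for harmonic in (1, 2, 3, 5):
    print(harmonic, round(dft_mag(x, harmonic * cycles), 4))
```

The same switched waveform that never "contained" a sine generator still shows this comb of odd harmonics, because that is what its shape projects onto the sinusoidal basis.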

1 Like

Here’s a way to think about it intuitively: When you hit a bell with a striker, you’re hitting it with a step function / delta function that imparts energy into the bell. The bell has a step-response displacement which attenuates the energy, but the resonant frequency component of the bell will attenuate much more slowly as the bell rings.
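The bell intuition can be sketched with the textbook closed-form step response of an underdamped second-order system (the 440 Hz frequency and damping ratio below are made-up illustrative values): the output overshoots and rings at roughly the resonant frequency, with an envelope that dies away slowly when the damping is light.

```python
import math

def step_response(t, f0, zeta):
    """Step response of an underdamped 2nd-order system: it rings at the damped
    natural frequency while the exp(-zeta*w0*t) envelope slowly decays."""
    w0 = 2 * math.pi * f0
    wd = w0 * math.sqrt(1 - zeta ** 2)          # damped (ringing) frequency
    env = math.exp(-zeta * w0 * t)
    return 1.0 - env * (math.cos(wd * t) + (zeta * w0 / wd) * math.sin(wd * t))

f0, zeta = 440.0, 0.01   # a lightly damped 440 Hz "bell" (illustrative numbers)
for periods in (0, 0.5, 5, 50):
    t = periods / f0
    print(f"after {periods:>4} periods: {step_response(t, f0, zeta):6.3f}")
```

Half a period after the strike the response has overshot to nearly twice the step height; fifty periods later it is still visibly ringing because zeta is so small.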

There are also lots of great GIFs that help to showcase this concept in action:


Oh man this is one of my favorite topics.

So I think what this discussion is ultimately getting at is: are all waveforms in physical reality actually composed of sine waves at various harmonics, all superimposed on each other, or is that just a mathematical artifact of how the Fourier transform is formulated?

Remember, if you look at the spectrum of a waveform, what you’re going to see is the FFT of that waveform. So naturally one will see that the frequency composition of that wave will agree with what a Fourier transform of it says it will be because you’re doing a Fourier transform.

In other words, it shows that Fourier transforms agree with themselves. A bit circular, isn’t it?

However, I am here to tell you that not only is this fundamental to all waves, it underpins reality as we know it.

Quantum mechanics (which is really just waves doing wave stuff; it should be called quantum wave dynamics), for example, has quantized waveforms called wave functions. A wave function can represent a single particle or an ensemble of them, depending.

But this wave function is a complex-valued function, with real and imaginary components that are identical but offset by a constant phase shift. The horizontal axis is space/position/distance in this case rather than time, however.

If you square the amplitude of both components and add them together, you get the probability of interacting with the thing represented by this wave function. And the phase shift is such that this number is always between 0 (the imaginary part yields a negative number when squared which cancels out the positive probability) and 1, the only valid range for a probability.

Position and energy/momentum are inversely related by the uncertainty principle. The more precisely you localize a particle, the wider the range of energies it can have. And vice versa.

The reason why is that physical reality, at the most fundamental level, physically manifests what one might think was just a mathematical formalism of the Fourier transform.

Imagine a sine wave that you want to turn into a wave packet, one that has a single sharp peak in the center and falls off quickly, like the sinc function.

How would you do that?

Well, you could add harmonics. You can keep adding harmonics and localize this wave into a packet more and more tightly, but at the cost of it no longer having a well-defined frequency. Now it exists as a superposition of many frequencies.
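You can watch that localization happen numerically. A minimal sketch: superpose equally spaced cosines, all in phase at t = 0, and see the sum cancel almost everywhere except near the peak as more frequencies are added.

```python
import math

def packet(t, n_freqs, df=1.0):
    """Average of n_freqs cosines spaced df apart in frequency, all peaking at t = 0."""
    return sum(math.cos(2 * math.pi * k * df * t) for k in range(1, n_freqs + 1)) / n_freqs

# One frequency is completely delocalized; fifty interfere destructively away from t = 0.
for n in (1, 5, 50):
    print(f"n={n:2d}  at t=0: {packet(0.0, n):5.2f}   away from the peak (t=0.37): {packet(0.37, n):6.3f}")
```

The single cosine is just as big far from t = 0 as at it; the fifty-frequency packet is near zero there. Narrower in time, wider in frequency: the uncertainty trade-off in a dozen lines.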

Frequency is the same as energy/momentum for waves.

So when you have, say, an electron with a single possible energy level/frequency, its wave function is delocalized. It is spread out; it doesn’t exist in any specific location any more than a sine wave exists at a given instant in time.

If you want to localize it, this means injecting energy into it so it can occupy multiple energy levels/frequencies. As you increase the uncertainty of its energy/momentum (add harmonics), you can localize it: the alternate versions of itself at different energy levels/frequencies interfere with each other.

In case the significance isn’t clear, this means that the very concept of location/position is merely an emergent property of the more fundamental wave dynamics that govern our physical reality.

So the Fourier transform has a physical manifestation and it is one that underpins physical reality itself. It is safe to say that it is something deeply fundamental to all continuous superimposable functions (which includes all waves of any basis), and that this is a real, physical thing and not just a result of how the math works.

FYI, this is how flash memory works. A fully isolated MOSFET gate has some charge injected into it via tunnel injection. This works by tightly confining the energy (frequency) of some electrons so they delocalize and one of the possible places they might exist is inside that isolated gate. Once this confinement is removed, some of them will be stuck on that gate.

You just wrote a bit by exploiting the physical manifestation of the Fourier transform. Kind of wild, isn’t it? :smiley:

Quick tangent: remember how I mentioned quantum wave functions are identical real and imaginary waves but locked in phase?

If this wave interacts with stuff over position, this will cause different interactions with the real and imaginary components and they will no longer be identical. This would permit negative probabilities to occur, which is no bueno.

We can add something called the Lagrangian, a variable we just made up and added to the wave function’s equation. It is defined as a variable whose value varies with space/distance in such a way that the real and imaginary parts of the wave function remain identical but out of phase, even when this phase shift would otherwise result in different propagation.

If something that has a value at every location and changes over position sounds like a field, that’s because it is.

That variable we added to keep the real and imaginary components identical turns out to be a field we all hold near and dear:

It is the electromagnetic field.

I know no one asked but it’s so rare I can weasel it into anything and it’s just so cool that I couldn’t help myself.

  • yes! this was my understanding as well
  • absolutely; my apologies for such poor wording! my hope here was to clarify that I had a (basic) understanding of Fourier transforms. But this is good advice in any case.
  • hrmm, I think I need to think about this some more, as it creates a circular argument (I think!)… If I put in a perfect square wave, generated by some means, the FFT will tell us that there are harmonics; a filter will reduce their amplitude, and again we will see that on an FFT. I have the AD2 and there is a great article on it here: Using the Analog Discovery 2 to Measure Harmonics in a Signal - Digilent Reference - but my fundamental (pun intended!) question is why those harmonics exist in a perfect square wave to begin with! In fact, there was a similar question asked on Stack Exchange (Where do overtones in a 555 generated square wave come from?), though I didn’t find any of the answers “satisfying”. Here is a small selection, which I think captures the general confusion on this topic and the mental model I have right now:

  • “Odd harmonics are fundamental to a square wave as your second illustration shows. (It would be better explained as : a square wave can be decomposed into an infinite series of sine waves)”

  • “It’s not that a square wave can be constructed from sine waves, as some optional thing or wacky view of it. A square is composed of sine waves.”

  • “The FFT (actually a dft) is looking for sinusoidal content by definition.”

  • “The square wave is . The 555 created it by switching the output. By feeding it into a spectrum analyser, you’re asking the question ‘what sine waves make up this square wave?’. If you’d fed it into a power meter, you’d be asking the question ‘what’s the power in this square wave?’ BTW, the 555 generates an approximation to a mathematical square wave, because the output voltage can’t switch infinitely fast. It’s pretty fast compared to 2.5kHz, but not fast compared to 100MHz”

  • “It does not mean that whatever generated the signal generated separate sinusoids, only that the resulting signal can be represented in this (very useful) way.”

  • “The key issue is that not only CAN a square wave be constructed from sine waves, it fundamentally IS a collection of sine waves.”

  • the Nuts & Volts article also says “Basically, a square wave consists of a fundamental frequency with a lot of higher harmonics. If the harmonics can be removed, then a sine wave of the fundamental frequency remains.”

The answers here essentially say “there are no sine waves, but the FFT by its nature is able to extract them and show them”, and that’s OK, I can understand that: it can mathematically break down the square wave it sees. But think about some practical implications and consider the signal: a rising edge that experiences ringing, for example. Take this video from Robert Feranec, Everyone designing boards needs to know this about power and noise | Florian Hämmerle; from the transcript:

“Because the edge of course is step but the step contains all the frequencies… it triggers the resonance and the resonance rings”

this video talks about switching an output at a specific frequency: due to the resistance, inductance and capacitance on the Vcc line, there will be a resonant frequency, and if your output switching is at that same frequency, you get much higher noise on the Vcc line.

in this image, the blue trace is the output GPIO and the yellow is the noise on the Vcc line; you can see the ringing when zoomed in.

Another video from Robert Feranec (and partially what kick started this journey) - Why the circuit in the thumbnail is wrong? Do you know?, specifically at 12:53

“also keep in mind that the square waves or signals with very sharp edges, for example when your chip is communicating with all the other chips or when it is controlling I/O pins for accessing to memory, these kinds of sharp signals they contain all the kind of frequencies”

All About Circuits: Square Wave Signals - “When a square wave AC voltage is applied to a circuit with reactive components (capacitors and inductors), those components react as if they were being exposed to several sine wave voltages of different frequencies, which in fact they are.”

What is driving my desire to understand this is, of course, the issue Robert Feranec (and others) refer to, whereby sharply changing signals / square waves have negative consequences, and being able to consider that impact in designs (which incidentally may not be hardware designs; as Robert shows, you may want to avoid switching outputs at specific frequencies, which can be a software choice!).

I may be reading ahead here into an answer from @metacollin, however it makes sense in context here: is the Fourier transform just a neat mathematical trick we can exploit for analytical or design purposes? The answer, as has been shown many times practically, is no! It is a real thing (again, see Robert Feranec’s video Everyone designing boards needs to know this about power and noise, linked above, for a practical demonstration). I am just not linking the concepts properly yet!

p.s.: with regard to Robert’s videos, I have reached out to him, as I wasn’t sure what was actually causing the ringing in the first place.

  • The videos start with the notion that when a MCU switches an output, the MCU will draw more current (reasonable)
  • due to internal resistance, track resistance, etc., when more current is drawn there will be more voltage drop across the stray resistance, and so Vcc will drop. Thus there will be ripple on the Vcc line with a frequency equal to the output switching frequency (still ok!)
  • it is not pure resistance, rather it is complex impedance and so there will be a resonant frequency (still ok!)

so, is the ringing at the resonant frequency of the Vcc circuit (formed by the stray resistance, inductance and capacitance) caused by

  1. the ripple frequency matching the resonant frequency or
  2. the square wave nature of the output switching or
  3. a combination!

my initial thought was that it was due to the ripple, but the conversation between Robert and Florian leads me to believe they were implying the second option. However, it is not very clear, hence I have reached out to Robert for clarification.

here are a couple of my favourites to add to the list of resources

1 Like
  • excellent! looking forward to another engaging discussion :slight_smile:
  • in both cases, absolutely correct; I hope this is what I conveyed in my reply to Chris, here.

everything you mention sounds very familiar from my college days, and yet that seems such a long time ago now. Though, if I am correct, distilled to single sound bites, you are in agreement with these statements:

  • “The FFT (actually a dft) is looking for sinusoidal content by definition”
  • “The key issue is that not only CAN a square wave be constructed from sine waves, it fundamentally IS a collection of sine waves.”

the latter being just how the world is?

Looking through the responses here (which are excellent, and I can’t thank you enough): this being Contextual Electronics, what is the practical way to deal with/manage this? I am sure I won’t be getting into the depths that @metacollin has provided on a daily basis, but is it valid to say:

“the very nature of square waves is that they are composed of a massive number of sine waves; it is not just a cute mathematical trick, but a simple fact of nature. It is a useful property of waves that we can exploit to our advantage, but it can also cause unexpected issues. The underlying mechanisms at work are not entirely clear to me, but the important thing to remember is that sharp transitions, square waves and the like do contain fundamental and harmonic frequencies, and not taking these into consideration in hardware design (and software!) can lead to unexpected consequences.”

You don’t care. Except if it is a circuit where you use it to your advantage, or if you are dealing with EMC or radios. Then you care.


I have to say, thinking about steep edges as an indicator of high frequency components is a very helpful and practical use of these mental/mathematical models. And as Chris said, use it in filters.
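A handy quantitative version of that mental model is the classic single-pole rule of thumb BW ≈ 0.35 / t_r, where t_r is the 10-90 % rise time (the helper name below is just for this sketch): it is the edge speed, not the repetition rate, that sets how far up the spectrum the signal's significant content reaches.

```python
# Rule of thumb for a single-pole system: 3 dB bandwidth ~ 0.35 / (10-90% rise time).
# Even a slow 1 kHz square wave with 1 ns edges has content reaching ~350 MHz.
def edge_bandwidth(t_rise):
    return 0.35 / t_rise

for t_rise in (1e-6, 10e-9, 1e-9):
    print(f"{t_rise*1e9:6.0f} ns edge -> ~{edge_bandwidth(t_rise)/1e6:8.1f} MHz")
```

This is why slowing the edges (series resistors, slew-rate-limited drivers) tames EMI even when the switching frequency stays the same.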

1 Like

I agree. My point is just that in general electronics, you often don’t need to worry about it (heavily constrained statement :-))

1 Like

The Fourier transform is nothing more than the most popular way to convert a signal from a time-domain representation to a representation based on orthonormal basis functions. It is the most popular because it is convenient, well understood, easy to calculate, convenient, useful, and convenient.

There is an infinite collection of transforms similar to Fourier, all of which are usable but most of which are inconvenient. Depending on what you’re up to, some of these other representations might be convenient, e.g. wavelets, certain types of polynomials, …
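To make "other bases exist" concrete, here is a small sketch: in a Walsh/Rademacher-style basis of ±1 functions, a square wave is a single basis function (one coefficient, done), while projected onto sines the same signal smears across infinitely many odd harmonics.

```python
import math

N = 64
# One period of a +/-1 square wave, sampled at N points.
square = [1.0 if n < N // 2 else -1.0 for n in range(N)]

# In a Rademacher/Walsh-style basis, the first basis function IS this square wave,
# so the whole signal collapses to a single coefficient of exactly 1.
c_walsh = sum(a * b for a, b in zip(square, square)) / N
print(c_walsh)  # → 1.0

# In a sine basis, the same signal needs many coefficients: ~4/(pi*n) on odd n, 0 on even.
def sine_coeff(k):
    return sum(square[n] * math.sin(2 * math.pi * k * (n + 0.5) / N) for n in range(N)) * 2 / N

for k in (1, 2, 3):
    print(k, round(sine_coeff(k), 4))
```

Which basis "the signal is made of" depends on which one you project onto; the square wave is no more fundamentally sines than it is fundamentally Walsh functions.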

It’s a really big stretch to claim that quantum mechanics implies that the square wave coming out of the MCU is fundamentally made of sine waves. The voltage on that pin is many approximations and orders of magnitude away from the quantum wave functions. The fact that we can use Fourier to talk about the macroscopic wave and we can use Fourier to talk about the quantum effects is convenient, not fundamental. The harmonics of a 10 Hz square wave are emergent effects; they are not the sum of 10 Hz quantum effects.

For switching signal edges the Fourier approach works really well. It can provide good-enough results quickly, and for linear systems it can get arbitrarily close to “perfect” by including arbitrarily many harmonics.

Take a poorly designed I2C bus with way too much bus capacitance for its pullup resistor. The rising edge is the classic exponential charging curve of an RC. You can model this as a square wave made of harmonic sines with a severe low-pass on it, but an exponential is probably easier. Does that make the exponential “fundamental”?
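Both descriptions of that RC edge really do converge on the same waveform. A sketch with hypothetical values (10 kΩ pull-up, 100 pF bus, 50 kHz square wave): the steady-state exponential edge computed directly matches a square wave assembled from odd harmonics, each attenuated and phase-shifted by the RC's transfer function H(jω) = 1/(1 + jωRC).

```python
import cmath
import math

R, C = 10e3, 100e-12        # hypothetical I2C pull-up and bus capacitance: tau = 1 us
tau = R * C
f0 = 50e3                   # square-wave fundamental; T/2 = 10*tau so edges settle
T = 1.0 / f0

def exact(t):
    """Steady-state RC response to a 0/1 square wave during the 'high' half (0 <= t < T/2):
    a plain exponential charging from the settled low value toward 1."""
    lo = math.exp(-T / (2 * tau)) / (1 + math.exp(-T / (2 * tau)))
    return 1.0 - (1.0 - lo) * math.exp(-t / tau)

def via_harmonics(t, n_harmonics):
    """Same waveform assembled from the square wave's Fourier series, each harmonic
    multiplied by the RC low-pass transfer function H(jw) = 1 / (1 + jw*tau)."""
    y = 0.5                                      # DC term of a 0/1 square wave
    for k in range(1, n_harmonics + 1):
        n = 2 * k - 1                            # odd harmonics only
        w = 2 * math.pi * n * f0
        h = 1.0 / (1.0 + 1j * w * tau)
        y += (2.0 / (math.pi * n)) * (h * cmath.exp(1j * w * t)).imag
    return y

for t in (0.0, T / 8, T / 4):
    print(f"t={t*1e6:5.2f} us  exponential={exact(t):.4f}  filtered harmonics={via_harmonics(t, 300):.4f}")
```

Neither form is more "real": the exponential is the compact time-domain description, the harmonic sum is the frequency-domain one, and they agree to the precision of the truncation.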

1 Like

after a nice summer walk, this was the conclusion I reached in the end as well! I fear I am reading into things too much and looking for “perfection”…

Up an Atom did a video recently; The Fourier Series and Fourier Transform Demystified - YouTube

I tend to think it’s more a tool for how we make sense of things and less about what is physically happening. Sure, on a deep level the whole of reality may be based on waves, but that isn’t what we see on an oscilloscope: we see something start at 0 V and end at some VDC (really, really fast).

Trouble is, the response can be so complex that we sometimes have little way of making sense of how a circuit responds like that. So, break it down into an approximation. Break it down into the waves which are mathematically equivalent. Their value is that they allow us to understand behaviour.

Every conductor and its return path make up a transmission line, with (lossless) series inductance and parallel capacitance in addition to (lossy) series resistance and parallel conductance. These (and similar internal elements of other electronic components) limit the rise and fall times of signals, and are also responsible for the natural sine-wave character of simple harmonic motion.