By the way, all of these examples describe ideal situations that can't really exist here on Earth. An omnidirectional antenna doesn't actually transmit in a sphere, but more like a donut. That's just reality; we can't all live in the deep recesses of space with an infinitely small antenna. But ideal situations are good to study because they tend to build understanding better than realistic examples.
I couldn't include this statement in my last post because I had exceeded the character count!
That is again a very clear explanation. I think I get it now. Once again, thank you :)
I have just put this down on paper in diagrams so I have it in my head.
Please would you mind reviewing them to make sure I have them correct?
I have done three pictures to look at the 3 types of antenna: omnidirectional, directional, and the theoretical isotropic. On them I have indicated the same wave but at different points in time. Also, noting that these antennas are pumping out 2.4 billion wavelengths every second. Man, that must be some oscillator (that's a new term for me also) hahah
Also, I just took a sine wave pic off the web and made it bigger on the diagrams just to indicate that the wave is getting bigger, not to indicate any amplitude or frequency change in any way.
One last question if I may, taking the directional antenna as an example (if any of them are correct, that is).
As the wave gets further away from the antenna, the wave gets bigger. Is that correct? As indicated by the sizes of my sine waves on the diagram.
If it is correct, and sorry if you have told me this already as my brain is about to explode, is there any mathematical relationship between how far the wave propagates from the antenna and how big the wave gets?
I think I am nearly there but feel free to blast my pics out of the water :))
Once again mate, many thanks indeed, for the kind help.
Ken, that's great info ... thanks for sharing ...
Wow, amazing pictures. Far better than I was able to make :)
I really don't like how textbooks show waves getting bigger as they expand. It is true that they get bigger, but they don't get stronger. Their amplitude doesn't increase; it actually decreases as the wave travels over free space. That makes sense, because signal strength decreases as we walk away from an access point or cell tower, and we grumble because we can't get coverage.
There's this concept of a "wave front" that describes, for lack of a better term, the front or surface of the wave as it travels outward. This expands as it travels and does indeed get bigger, as they say. However, the amplitude gets smaller as the wave front expands.
Tired of analogies yet? Here's another one - Imagine a kid chewing some gum and starting to blow a bubble. As the bubble expands, the AMOUNT of gum doesn't increase, and yet it's getting bigger. The reason is that it's stretching out. The mass of gum continues to stretch thinner and thinner, causing its structural integrity to continue to deteriorate. So the gum is weaker at any given point despite the gum as a whole being bigger. Eventually, the thing pops and the kid has a mess.
For waves, there isn't a single mass of wave, per se, but there's a single amount of energy that the wave contains. This energy starts off very concentrated and "thick", but in a relatively confined space. As the wave travels outward, the surface or wave front continues to get bigger and bigger, but the energy is spread out over a larger area. So the amplitude is decreasing as the wave gets bigger.
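If it helps to see numbers, here's a tiny sketch of that spreading, assuming an ideal isotropic radiator in free space (the 100 mW transmit power is just a made-up figure, roughly what a WiFi AP might use):

```python
import math

def power_density(p_tx_watts, distance_m):
    """Power density (W/m^2) of an ideal isotropic radiator: the transmitted
    energy is spread over the surface of a sphere of area 4*pi*r^2."""
    return p_tx_watts / (4 * math.pi * distance_m ** 2)

# Hypothetical 100 mW transmitter, sampled at doubling distances
for r in (1, 2, 4, 8):
    print(f"{r} m: {power_density(0.1, r):.6f} W/m^2")
```

Every doubling of distance quarters the power density: same total energy, four times the surface area to spread it over.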
The analogy breaks down when talking about the bubble bursting. The wave never bursts; as we discussed earlier, it simply fades into the background noise of the universe (assuming free-space travel out to infinity).
Strange, huh? As for your diagrams, that's the only thing I found to not be quite accurate. Very good and clearly-displayed info. It's clear you've done a lot of learning over the last few days. Major props for trying to figure this stuff out, I think a lot of wireless engineers don't care about understanding the physics behind the wireless signals. I'm with gstefanick, 5 stars for working so hard at this.
You have just made an outstanding point to me. I hope others read this as it is so important!!
"For waves, there isn't a single mass of wave, per se, but there's a single amount of energy that the wave contains. This energy starts off very concentrated and "thick", but in a relatively confined space. As the wave travels outward, the surface or wave front continues to get bigger and bigger, but the energy is spread out over a larger area. So the amplitude is decreasing as the wave gets bigger. "
The point above is excellent, but could you just tell me what you mean when you say "there isn't a single mass of wave, per se"?
I thought there was, in relation to the frequency?
Many thx fella. Now learning about free space loss, attenuation, and modulation (I now know the difference between AM and FM). Only after 37 years of being alive :))
AM and FM are very basic types of modulation. The more advanced stuff that WiFi uses is unfortunately far more complicated. I'm sure you're already finding that out though, haha. To be honest, I don't understand a lot of the more advanced modulation schemes. They're quite hard to follow!
I'm sorry for confusing you, I was just trying to distinguish between the bubble gum analogy and the physics of the wave. The point is that a transmitter broadcasts a wave at a certain power level. The energy in that wave, in an ideal environment, will remain the same as the wave expands, similar to how the amount of bubble gum never shrinks as the bubble gets bigger.
So yeah, I was just comparing the energy of the wave to the mass of the bubble gum. The concept is very similar in that regard.
Many thx for all the help buddy :)
It makes perfect sense
Enjoy the gaming this w/e fella.
Brilliant discussion and Jeff is one of the best guys you see here on wireless.
The last three pics you posted show the isotropic radiator. This is the theoretical perfect radiator, of which the sun is about our only example, i.e. radiating evenly in all directions.
From my degree days: as you move away from the source, you use an inverse square law to degrade the signal strength.
I will be reading this thread more thoroughly, and my view is that if you get your head around this you will be ahead of many engineers putting in WLANs. You need to be able to visualise what the radio is actually doing to do good planning, and also to see where the main pitfalls are.
There are some good explanations in the CWNA guide (sorry but it is relevant)
Just my two cents
Many thx for the input on the . That is such an important point with the inverse square law. I am just getting my head around this :)
Like you say, I want to visualise electromagnetic waves. Am getting there I think with all the great help from you guys.
Did you see this picture in the previous post?
Very interesting view of what a dipole antenna's electromagnetic wave looks like
And another interesting picture of an electromagnetic wave in motion
Also, are OFDM and DWDM very similar? One is just using a master RF carrier wave and splitting it into sub RF carrier waves, and DWDM is just using one light frequency and splitting it into multiple sub-channels?
Many thanks to all for the excellent help,
Pete, thanks for the kind words :) 5 points for the link, that picture at the top is an excellent visual of a wavefront propagating in free space.
Ken, I wish I could watch the .mov to comment :( I never think to watch it when I'm at home, haha.
OFDM and DWDM are very similar in that regard, at least. Perhaps the main difference is that DWDM uses light and OFDM uses radio waves. Great analogy, I'd never really thought of that.
The key point for both those technologies is to realize that signals at different frequencies do not affect each other. It's a bizarre principle that's difficult to understand, or at least it is for me :) I remember in Physics class once seeing this principle in action with a jump rope, though. If you tie one end to an oscillator and have someone hold the other end, you can set the oscillator to a low frequency, so it's creating small waves all the way across the jump rope. If the person at the other end suddenly whips the jump rope up and down, he creates a massive wave that travels in the opposite direction of the little waves. At first it looks like this giant wave eats the little waves, but you actually see the little waves come out the other side. They don't affect each other because they have different frequencies.
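That jump rope demo can even be sketched numerically. Here's a toy superposition in Python with two made-up frequencies sharing the same "medium" (a list of samples): correlating the mixed signal against each frequency shows both components survive the mixing untouched.

```python
import math

N = 1000            # samples over one second (hypothetical sample rate)
f1, f2 = 3.0, 7.0   # two arbitrary frequencies, in Hz

t = [n / N for n in range(N)]
wave1 = [math.sin(2 * math.pi * f1 * x) for x in t]          # amplitude 1.0
wave2 = [0.5 * math.sin(2 * math.pi * f2 * x) for x in t]    # amplitude 0.5

# The medium just carries the sum -- superposition, no interaction
combined = [a + b for a, b in zip(wave1, wave2)]

def amplitude_at(signal, freq):
    """Correlate against a sine at `freq` to recover that component's
    amplitude (a single-frequency Fourier coefficient)."""
    s = sum(v * math.sin(2 * math.pi * freq * x) for v, x in zip(signal, t))
    return 2 * s / len(signal)

print(round(amplitude_at(combined, f1), 3))  # 1.0 -- wave1 comes out intact
print(round(amplitude_at(combined, f2), 3))  # 0.5 -- so does wave2
```

Like the little waves emerging from the far side of the big whip, each frequency can be pulled back out of the combined signal with its original amplitude.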
It was helpful to see, but it's still so darn confusing :/ Waves of different frequencies can share the same medium without interfering with each other.
That's a good experiment to try with the rope. Yes, I would have thought the large wave would eat them up :)
Hi Pete, is the A = Amplitude in the diagram? Sorry, I can't see it referenced, but am assuming it is, so as you say, the signal strength (i.e. amplitude) will decrease by the inverse square law.
Amplitude = signal strength, correct? I have probably already been told this, but my brain is leaking info at the mo :)
Also, Pete, Jeff, if I took this full circle, can the same be said about Ethernet? Is it just an oscillating electrical signal producing a sine wave over the wire, and does this have a wavelength and frequency? Is this carrier there all the time, or, as I read (and probably misinterpreted), is the carrier signal only there when data is ready to be modulated?
I need to get myself an oscilloscope to show all of this. Damn interesting stuff. I have been in the networking field for quite a while and have never thought about this as much as I am now, thx to you guys!!!!
So to summarise the Ethernet point (not very short):
Let's say in wireless, the frequency is 2.4 GHz and the wireless carrier is always there, and only when data needs to go over the airwaves is the data encoded onto the carrier wave. Correct?
Is it the same for an optical carrier signal? The frequency and carrier waves are being generated all the time, whether data is ready for transmission or not? Correct?
Let's say I have my PC at home, connected to my Ethernet hub, and nothing is being transmitted by my PC or on the wire (theoretically). Is there a constant electrical signal generating a carrier wave on the wire between my laptop and hub *** OR *** with Ethernet, does a carrier only exist when data is ready to be transmitted?
If I get these points confirmed, I think I am there on all topics. It only started out on wireless and has now become generic, and man, I have learnt stuff on the way :)
Ken, you're challenging us all to think about things that we often take for granted. Thanks for keeping us on our toes, I know that I've learned quite a bit myself by participating in this discussion!
To answer your first question, Ethernet over copper is not a frequency-based signal. Instead, electricity is used over copper to produce the signal. Generally, placing a positive voltage on the line produces a current in one direction, a binary 1, and placing a negative voltage on the line produces current in the other direction, a binary 0. This is why coax and UTP copper have two "cables": a circuit must be completed between hosts. UTP has 4 pairs, and thus allows for 4 communication streams at once. This is why you can run full duplex using UTP, but with coax you can only run half duplex, since there's only one pair of copper (the inner conductor and the outer shield). Since the voltage applied is DC (as opposed to AC), there is no oscillation and no frequency.
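To make the voltage idea concrete, here's a toy sketch of that kind of signaling in Python. This is simple polar coding, just the "positive voltage = 1, negative voltage = 0" concept; real Ethernet line codes (Manchester, MLT-3, PAM) are more involved, and the ±2.5 V figure is purely illustrative:

```python
def encode(bits, volts=2.5):
    """Map each bit to a DC voltage level: 1 -> +V, 0 -> -V.
    (A toy polar line code, not a real Ethernet encoding -- it just
    illustrates 'current one way = 1, current the other way = 0'.)"""
    return [volts if b else -volts for b in bits]

print(encode([1, 0, 1, 1]))  # [2.5, -2.5, 2.5, 2.5]
```

Note there's no sine wave anywhere here: the line just sits at one level or the other for each bit time, which is why there's no carrier frequency to speak of.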
I think your definition of a wireless carrier signal is very good. However, DSSS (802.11b) doesn't use a carrier wave, and OFDM (802.11a/g/n) uses carrier waves a bit differently (discussed below). My understanding is that the carrier wave is meant to be a heartbeat of sorts to wireless receivers, keeping them all calibrated and ready to receive more data. However, if an AP were to continually transmit a carrier wave, wireless clients would never be able to talk since the medium would always be busy. I assume (I wasn't able to verify this) that carrier waves can't be used when there are multiple speakers in an environment. But it would be perfect for radio stations and broadcast TV channels.
As for OFDM, all of the above still applies because it only uses carrier waves once it starts talking. Since an OFDM speaker communicates over a multitude of channels at once, each subchannel contains a carrier wave that is modulated to transmit data. This is because the subchannel must remain calibrated when it's not in use so it can be reused at any time. However, once the entire transmission is complete, the transmitter is silent. It doesn't use a "broad" carrier wave.
Something to consider about WiFi is that the start of any WiFi packet contains a series of alternating 0s and 1s for calibration. This is called the preamble. So this takes care of the fact that there is no carrier wave to keep everyone calibrated. Ethernet over copper also uses a preamble to calibrate everyone on the segment. For both WiFi and Ethernet copper, the medium is completely silent if there are no talkers.
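A trivial sketch of that preamble idea (classic Ethernet sends 7 octets of alternating 1s and 0s, i.e. 56 bits, before the start-of-frame delimiter):

```python
def preamble(n_bits=56):
    """Alternating 1,0,1,0,... calibration pattern, as sent at the start
    of a frame so receivers can sync up before the real data arrives."""
    return [(i + 1) % 2 for i in range(n_bits)]

print(preamble(8))  # [1, 0, 1, 0, 1, 0, 1, 0]
```

The receiver doesn't care about the values themselves, only the steady rhythm of the transitions, which is exactly the calibration job a continuous carrier wave would otherwise do.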
I can't speak about fiber, I'm not so knowledgeable about the subject. As discussed earlier, Ethernet over copper doesn't use a frequency-based signal, so it doesn't use carrier waves.
Does that all make sense?
Once again, brilliant explanation. Thx very much.
I have also been reading the definition of broadband versus baseband.
I have come to the following conclusions:
Baseband - Ethernet: a single carrier using all of the medium. Uses TDM to increase transmission capabilities.
Broadband - Wireless, optical, X21, RS232, ADSL, etc.: uses multiple carriers, all using a type of FDM to increase transmission capabilities, and all use carrier waves.
I think I have it correct. It's amazing: once you start looking at one thing, it spirals into others.
Many thx for all the help, and hope you have a great time gaming mate :)
Thanks very much to all :)