
Topic: Questions about WLANS, Wireless Networks, and Wifi

012anonymousxyz

Oct 14, 2013, 02:24 am Last Edit: Oct 14, 2013, 02:31 am by 012anonymousxyz Reason: 1
Hello! This has zero to do with the Arduino, but I come here because everyone is super friendly and technologically oriented.
I want to know if I have some concepts right, so I'm going to state what I think are facts; if someone would correct me, that would be awesome. I am also going to ask some questions.

I was reading up on WLANs and 802.11 and radio frequencies and Wifi and Wimax  and I am confused about the hierarchy of it all.

As I understand it, Wifi is a type of WLAN that follows 802.11 standards. Wimax follows 802.16 standards.

First and foremost, I could not for the life of me find ANY other types of WLANS than wimax and wifi. What other options exist?

Additionally, some radio theory:
lower frequency = higher penetrating power and longer range
Higher frequency = higher rate of data transfer.

Therefore, would I be correct in saying that the ultimate goal is the lowest frequency with the highest data transfer rate? How, theoretically, would you achieve this? I.e. 802.11g is a standard defined for the 2.4GHz band that raised the standardized speed from 11Mbit/s (the old 802.11b standard) to 54Mbit/s. I assume some sort of technological invention allowed this. What I am asking is: how did it become possible to drastically increase the data transfer rate without moving to a higher frequency?

Additionally, what do these standards actually say? Do they only define the frequency and the data transfer rate range (i.e. 24Mbps-54Mbps) a type of connection must have in order to qualify? Put another way: if I can invent some way of transferring 54Mbps on the 2.4GHz band, do I get the 'privilege' of saying I follow the 802.11g standard?

What are 'channels?'
As I understand it, a channel is a subdivision of a 'band.' So for the 2.4GHz 'band,' channel 1 is 2401MHz to 2422MHz. Then channel 2 is 2406MHz to 2428MHz. Devices can communicate bits through each 'MHz' Frequency. So they can transfer 22 bits for a given time interval. One of the products I was looking at (you will see further down) has 'adjustable channel size support.' Why would you ever want to make a channel bigger or smaller?

Also why can't channels be KHz wide? Modern technology still doesn't have a fine enough resolution? Furthermore, why were channels selected to be specifically 22MHz wide? What advantage does this offer?

Next... the decibel. I hear all these things about decibel powers. The only thing I derived from Wikipedia is that the decibel in radio communication is referenced to 1 milliwatt of power. But what does a 'decibel' actually do? Does it affect range? I thought that was frequency. Does it affect data transmission?
[edit]: I know that the decibel is a relative conversion using the log scale of power ratios. What I am asking is: what does having a higher decibel figure actually do?

So the reason I'm asking all this is because I am looking to somehow transfer high resolution images from a minimum distance of 3km and maximum of 50km. I need to know the best, most efficient method (with the context that one of the points will be moving at very high velocities).

Last time, the project team used this:
Product: http://www.ubnt.com/wispstationm5
Datasheet: http://dl.ubnt.com/WSM5_DS.pdf

In the datasheet you will notice this:
For receiving, the WispStation M5 has -97 dB sensitivity at 1-24 Mbps with a tolerance of +/-2dB.
Now: what does that actually mean...?

Similarly, on the transmit side, it was 1-24Mbps with an Avg. Tx of 23dB. I found it curious that the 'sensitivity' was replaced with Avg. Tx. Again, what does this mean...?

I guess just some context: I am on a university aerospace team and we are building a UAV. I am a first year, and there are some master's students working on it and even PhD candidates helping out, so I really want to know what I am talking about. Last time, during a meeting, they really grilled one of the students over decisions in a gimbal the student designed, and it was pretty nasty.

I really really appreciate the help! If some tech-savvy person would explain it in simple language, it would really help me to understand the complex picture. I also really value recommendations if anyone has any for best methods of doing the above (maximizing data transfer for 3km-50km).

nickgammon

Quote

Also why can't channels be KHz wide? Modern technology still doesn't have a fine enough resolution? Furthermore, why were channels selected to be specifically 22MHz wide? What advantage does this offer?


I'm no radio expert, but my guess is that, assuming they use FM, to have a 1 kHz channel you would need to be able to detect frequency changes of 1k / 2400M, which is a tiny, tiny amount (0.0000004). I doubt that the detection circuit could distinguish such tiny frequency changes. I'm probably out by a factor of two somewhere; no doubt the ham radio enthusiasts will set me straight. :)
Please post technical questions on the forum, not by personal message. Thanks!

More info: http://www.gammon.com.au/electronics

mauried

It's a frequency stability problem.
At 2.4 GHz, maintaining a 1 kHz channel needs a frequency stability of about 4 parts in 10^7, which, whilst possible, requires extremely good oscillators.
There's also little need for 1 kHz wide channels at 2.4 GHz.
Generally, the higher frequencies are chosen so that faster bit rates can be achieved.
How fast depends on many variables, but the theoretical limit is set by the bandwidth and the signal-to-noise ratio.
http://en.wikipedia.org/wiki/Shannon%E2%80%93Hartley_theorem
Shannon's theorem, as it's commonly called, explains the relationship between maximum data rate, channel bandwidth and signal-to-noise ratio for communication over a noise-limited communications channel.
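
To make that concrete, here is a minimal sketch (Python, with purely illustrative figures of my own): a 22 MHz channel with a 20 dB signal-to-noise ratio has a theoretical ceiling of roughly 146 Mbit/s, and the only ways to raise that ceiling are more bandwidth or a better signal-to-noise ratio.

```python
import math

def shannon_capacity_bps(bandwidth_hz, snr_db):
    """Shannon-Hartley limit: C = B * log2(1 + S/N)."""
    snr_linear = 10 ** (snr_db / 10)   # convert the dB figure back to a power ratio
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative figures only: a 22 MHz wide channel at a few signal-to-noise ratios.
for snr_db in (5, 10, 20, 30):
    capacity = shannon_capacity_bps(22e6, snr_db)
    print(f"SNR {snr_db:2d} dB -> theoretical ceiling {capacity / 1e6:6.1f} Mbit/s")
```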



012anonymousxyz

I see. Okay, and critically please, what does this mean:

"For receiving, the WispStation M5 has -97 dB sensitivity at 1-24 Mbps with a tolerance of +/-2dB."

I see this statistic in each datasheet, but I'm confused because dB is the log of a ratio. So what does it mean to have -97 dB sensitivity? What is the implication?

012anonymousxyz

#4
Oct 14, 2013, 10:23 pm Last Edit: Oct 14, 2013, 10:26 pm by 012anonymousxyz Reason: 1
Excerpt from Wikipedia:

"Long wave is 153-279 kHz, with 9 kHz channel spacing generally used. Long wave is used for radio broadcasting only in ITU region 1 (Europe, Africa, and northern and central Asia), and is not allocated elsewhere. In the United States, Canada, Bermuda, and U.S. territories, this band is mainly reserved for aeronautics navigational aids, though a small section of the band could theoretically be used for microbroadcasting under the United States Part 15 rules. Due to the propagation characteristics of long wave signals, the frequencies are used most effectively in latitudes north of 50°.

Medium wave is 531-1,611 kHz in ITU regions 1 and 3, with 9 kHz spacing, and 540-1610 kHz in ITU region 2 (the Americas), with 10 kHz spacing. ITU region 2 also authorizes the Extended AM broadcast band between 1610 and 1710 kHz. Medium wave is the most heavily used band for commercial broadcasting. This is the "AM radio" that most people are familiar with.

Short wave is approximately 2.3-26.1 MHz, divided into 14 broadcast bands. Shortwave broadcasts generally use a narrow 5 kHz channel spacing. Short wave is used by audio services intended to be heard at great distances from the transmitting station. The long range of short wave broadcasts comes at the expense of lower audio fidelity. The mode of propagation for short wave is different (see high frequency). AM is used mostly by broadcast services; other shortwave users may use a modified version of AM such as SSB or an AM-compatible version of SSB such as SSB with carrier reinserted."

How can short wave be used for great distances? Lower frequency = longer distances... right? But here the higher frequency gives the longer distance.

Also... khz wide channels... :).

As I understand it btw, under one channel you can only send one stream of data? Channel is the resolution or width for detecting bits. Sort of like, 0V and 5V. If that makes any sense...

msssltd

#5
Oct 15, 2013, 03:38 pm Last Edit: Oct 15, 2013, 04:04 pm by MattS-UK Reason: 1
Caveat: I am neither a teacher nor a radio engineer, and it is a long time since I scribbled down my notes on this stuff.


I"For receiving, the WispStation M5 has -97 dB sensitivity at 1-24 Mbps with a tolerance of +/-2dB."

I see this statistic in each datasheet, but I'm confused because db in the log of a ratio. So what does it mean to have -97 dB sensitivity? What is the implication..


A dBm is a unit of power relative to 1 mW.

Assume you have a transmitter in an environment free of interference and you measure the output power as 1 mW at the transmitter output.  1 mW = 0 dBm.  You send that mW into a x10 high gain antenna and measure the radiated output as 100 mW.  100 mW = 20 dBm.

As the signal radiates from the antenna, signal density falls with distance, so signal strength is attenuated by distance.  The 100 mW output measured at the antenna = 0 dB of attenuation.  Start walking away from the transmitter and continue to measure the signal strength, keeping the antenna in line of sight.  When the signal strength falls to 80 mW, it has been attenuated by about 1 dB.  Keep walking away, and when the signal strength has dropped to 10 mW, it has been attenuated by 10 dB.  Keep walking, and when the signal has fallen to 1 mW, it has been attenuated by 20 dB.  At 1 µW the attenuation is 50 dB, and at 100 dB of attenuation the original 100 mW signal is down to just 10 pW.

When we look at the radio receiver, the transmitter power is of little interest. What is of interest is the sensitivity, in other words the capability of the receiver to distinguish and demodulate an attenuated signal. Since 0 dBm = 1 mW, -97 dBm will be somewhere between 1 pW and 0.1 pW.
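
To put numbers on that, the conversion is just P = 1 mW x 10^(dBm/10), so -97 dBm works out to roughly 0.2 pW. A minimal sketch:

```python
import math

def dbm_to_watts(dbm):
    """Convert a dBm figure to watts (0 dBm = 1 mW by definition)."""
    return 1e-3 * 10 ** (dbm / 10)

def watts_to_dbm(watts):
    """Convert a power in watts to dBm."""
    return 10 * math.log10(watts / 1e-3)

print(dbm_to_watts(20))    # 0.1 W  -> the 100 mW (20 dBm) transmit power above
print(dbm_to_watts(-97))   # ~2e-13 W, i.e. roughly 0.2 pW at the receiver
print(watts_to_dbm(1e-3))  # 0.0    -> 1 mW is 0 dBm by definition
```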

In ideal conditions, the receiver sensitivity value will correlate to a distance, in the same way the transmitter's attenuation correlated to a distance.  Electromagnetic waves radiate from an antenna as a rough sphere, so distance is expressed as a radius.  The distance depends on other factors, such as the signal wavelength, antenna efficiency and environmental losses.  These factors are incorporated in the Friis transmission equation:
Received Power = (Transmit Power x Transmit Gain x Receive Gain x wavelength^2) / (losses x (4 x pi x radius)^2)

Using the Friis equation, in ideal conditions a 900 MHz signal transmitted at 100 mW can be demodulated by a receiver with -97 dBm sensitivity ~3 km away.
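
Roughly how that calculation goes, as a minimal sketch (my own illustrative figures: unity-gain antennas and no extra losses, so treat the distances as free-space best cases rather than real-world range):

```python
import math

def received_power_dbm(tx_power_dbm, tx_gain_dbi, rx_gain_dbi, freq_hz, distance_m):
    """Friis free-space link budget; working in dB turns the products into sums."""
    wavelength_m = 3e8 / freq_hz
    # Free-space path loss in dB: 20 * log10(4 * pi * d / wavelength)
    fspl_db = 20 * math.log10(4 * math.pi * distance_m / wavelength_m)
    return tx_power_dbm + tx_gain_dbi + rx_gain_dbi - fspl_db

# Illustrative figures only: 100 mW (20 dBm) at 900 MHz, unity-gain antennas,
# no cable or environmental losses, compared against -97 dBm sensitivity.
sensitivity_dbm = -97
for distance_m in (3000, 10000, 50000):
    pr = received_power_dbm(20, 0, 0, 900e6, distance_m)
    print(f"{distance_m / 1000:5.0f} km: received {pr:6.1f} dBm, "
          f"margin {pr - sensitivity_dbm:+5.1f} dB over the -97 dBm sensitivity")
```

With those assumptions the free-space margin at 3 km is comfortable, but it is gone well before 50 km; closing the longer link is what antenna gain is for.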

In real life, the conditions are never ideal.  If you were paying attention to the implications of the dBm unit, you may realise that small differences from the ideal cause a significant drop in useful range.

Quote
How can short wave be used for great distances? Lower frequency = higher distances... right? But here higher frequency is higher distance.


At the very edge of the atmosphere is a layer of charged particles, known as the ionosphere.  When a long wave hits the ionosphere it gets reflected back towards Earth, and so Marconi's original radio transmission was able to travel thousands of miles, further than the line of sight to the immediate horizon.  At the other end of the radio spectrum are microwaves, which are not reflected by the ionosphere, so microwave transmission is restricted to line of sight.  In between are a bunch of other electromagnetic frequencies, being reflected, refracted and absorbed differently by the different materials which comprise the natural world.  Exactly how an electromagnetic wave of a particular frequency interacts with the world, the behaviour it exhibits, determines the applications it is good for.  E.g. theoretically you should be able to heat food with a long wave, but it is really, really, really inefficient.

I ought to add: microwaves are particularly suited to digital transmission because the signal reflections have phase shifts which are easily detected and filtered out.  In short, microwaves are less susceptible to indirect and environmental interference.  The significant downside is the need for orbiting relay satellites to carry signals over the horizon.

Quote
As I understand it btw, under one channel you can only send one stream of data? Channel is the resolution or width for detecting bits. Sort of like, 0V and 5V. If that makes any sense...


Wi-Fi splits the band into between 11 and 14 overlapping channels.  Each one of those channels is able to host multiple devices and even multiple access points associated with different networks.  That is a lot more than one stream of data.
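
To put numbers on the channel layout (a rough sketch; the standard 2.4 GHz plan puts channel 1's centre at 2412 MHz, spaces the centres 5 MHz apart, and each 802.11b/g channel occupies roughly 22 MHz):

```python
# Rough sketch of the 2.4 GHz Wi-Fi channel plan (802.11b/g).
CHANNEL_1_CENTRE_MHZ = 2412
CHANNEL_SPACING_MHZ = 5
CHANNEL_WIDTH_MHZ = 22

# Channels 1-13; channel 14, where permitted, sits separately at 2484 MHz.
for ch in range(1, 14):
    centre = CHANNEL_1_CENTRE_MHZ + (ch - 1) * CHANNEL_SPACING_MHZ
    low = centre - CHANNEL_WIDTH_MHZ / 2
    high = centre + CHANNEL_WIDTH_MHZ / 2
    print(f"channel {ch:2d}: {low:.0f}-{high:.0f} MHz (centre {centre} MHz)")
```

The output also shows why adjacent channels overlap heavily and why only channels 1, 6 and 11 are usually treated as non-overlapping.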

You are thinking way too simply.  Wi-Fi transmission is pretty sophisticated stuff.  802.11g uses Direct Sequence Spread Spectrum transmission with Orthogonal Frequency Division Multiplexing, and I won't pretend to know exactly what OFDM over DSSS means.  I have no more than a loose grasp, but enough to know you have to think in 3 dimensions rather than the 2 dimensions of a wired digital transmission.

012anonymousxyz


Assume you have a transmitter in an environment free of interference and you measure the output power as 1 mW at the transmitter output.  1 mW = 0 dBm.  You send that mW into a x10 high gain antenna and measure the radiated output as 100 mW.  100 mW = 20 dBm.

Surely that is a x100 gain, no?


When we look at the radio receiver, the transmitter power is of little interest. What is of interest is the sensitivity, in other words the capability of the receiver to distinguish and demodulate an attenuated signal. Since 0 dBm = 1 mW, -97 dBm will be somewhere between 1 pW and 0.1 pW.

So the higher the 'sensitivity,' i.e. the lower the dBm figure, the better, correct?


You are thinking way too simply.  Wi-Fi transmission is pretty sophisticated stuff.  802.11g uses Direct Sequence Spread Spectrum transmission with Orthogonal Frequency Division Multiplexing, and I won't pretend to know exactly what OFDM over DSSS means.  I have no more than a loose grasp, but enough to know you have to think in 3 dimensions rather than the 2 dimensions of a wired digital transmission.

Alright, I'm not even going to try to understand that yet. Funny, I often get accused of over-complicating things.

I appreciate it!

mauried

OFDM is a technique used by services like digital TV transmitters and also digital radio to transmit high bit rate data in a bandwidth-limited channel.
Transmitting high bit rate data is hard, as it gets affected by transmission anomalies like frequency-selective fading and multipath.
To overcome these problems, OFDM was developed, which transmits high bit rate data at a slow bit rate, which might sound impossible.
To do this, the high bit rate data is broken down into lots of low bit rate channels, which are all stacked next to each other, and the whole lot is transmitted together.
At the receiving end, all the low bit rate channels are demultiplexed and the original high bit rate data is reconstructed.
This makes the transmission quite rugged and impervious to frequency-selective fading and multipath.
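
A back-of-the-envelope sketch of that idea, using the 802.11g OFDM figures as I understand them (48 data subcarriers behind the 54 Mbit/s headline rate; treat the numbers as illustrative):

```python
# Back-of-the-envelope view of the OFDM idea behind 802.11g (illustrative figures).
headline_rate_mbps = 54   # the advertised 802.11g data rate
data_subcarriers = 48     # 802.11g OFDM carries data on 48 subcarriers (plus 4 pilots)

per_subcarrier_mbps = headline_rate_mbps / data_subcarriers
print(f"each subcarrier only needs to carry about {per_subcarrier_mbps:.3f} Mbit/s")
# -> roughly 1.1 Mbit/s each: many slow, narrow streams in parallel instead of one
#    fast, wide one, which is what makes the link robust against selective fading
#    and multipath.
```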
