Highest Rated Comments


skinwill (112 karma)

I would argue the biggest difference is how the data is modulated onto the carrier and more advanced frequency hopping. Both use technologies that require greater processor power in the mobile device. That's what's so freaking hilarious about the "5G BAD" BS: they are doing more with less power. Some of the added frequencies they use are higher and more fragile, so they need more towers. In this case they call them micro and pico cells. Dave Jones had a good rude laugh about this: https://www.youtube.com/watch?v=4vHx-UyIM9M

skinwill (62 karma)

That and the propagation of 28-300GHz in open air... I've seen the absorption charts; there are some bands that are better than others, but none of it is going to go very far. Like only a few miles, or much less when it rains.
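For a rough sense of why those bands don't go far, here is a minimal sketch of free-space path loss at 28 and 60 GHz; the absorption and rain figures in the comment are ballpark assumptions layered on top, not numbers taken from any chart in this thread:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / C)

mile = 1609.34  # meters
for f_ghz in (28, 60):
    loss = fspl_db(mile, f_ghz * 1e9)
    print(f"{f_ghz} GHz over 1 mile: {loss:.1f} dB free-space loss")

# Rough, assumed extras (illustrative only): oxygen absorption near 60 GHz
# is on the order of 10-15 dB/km, and heavy rain can add several more dB/km
# at these frequencies, which is why mmWave links shrink when it rains.
```

That works out to roughly 125 dB at 28 GHz and 132 dB at 60 GHz over a single mile before any atmospheric or rain loss is counted.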

I do have a problem with the US FCC deciding to use the same frequencies that NOAA uses to monitor rain. I think it's somewhere around 24GHz. I don't know what the latest news is, but the decision to transmit on frequencies used to passively monitor storms will set weather prediction back decades. https://www.aip.org/fyi/2019/noaa-warns-5g-spectrum-interference-presents-major-threat-weather-forecasts

edit: Links

skinwill (35 karma)

Store OICU812 here, we are all out of turboencabulators.

skinwill (30 karma)

I work with 60, 80 and 300GHz network relay equipment regularly. We have hardware that transmits a pencil beam that will get dorked up if you are not aligned precisely, within +/-0.5deg. But the newer equipment doesn't care. It still has a pencil beam, but it will beam form and hunt for the target itself. We also have a system that is point-to-multipoint, where one base station can handle many other stations. It is able to form many beams without any need to align the hardware other than to point the remote station in the general direction, within +/-5deg.
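For a feel of why a pencil beam needs that kind of alignment, here is a rough rule-of-thumb sketch relating antenna gain to half-power beamwidth; the 38 and 43 dBi gains are assumed for illustration, not the specs of the gear described above:

```python
import math

def beamwidth_deg(gain_dbi: float) -> float:
    """Approximate half-power beamwidth (deg) for a symmetric pencil beam,
    using the common rule of thumb G ~ 41253 / (theta_az * theta_el)."""
    gain_linear = 10 ** (gain_dbi / 10)
    return math.sqrt(41253 / gain_linear)

for gain in (38, 43):  # assumed example gains for mmWave dishes
    print(f"{gain} dBi -> ~{beamwidth_deg(gain):.1f} deg beamwidth")
```

A 43 dBi dish comes out to roughly a 1.4 degree beam, which is why a fixed mount has to be pointed to within a fraction of a degree while a beam-forming array can get away with coarse aiming.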

Point is, even with beam forming we still get -65dBm 500 feet away. You tell me what that is in watts. LOL
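For anyone who doesn't want to do the dBm-to-watts conversion in their head, a one-line sketch:

```python
def dbm_to_watts(dbm: float) -> float:
    """Convert dBm to watts: P(W) = 10^(dBm/10) / 1000."""
    return 10 ** (dbm / 10) / 1000

print(dbm_to_watts(-65))  # ~3.2e-10 W, about a third of a nanowatt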