> That's a great theory and all, except we have gigabit internet in places like Kansas City, where Google Fiber is located. The cost is 70 bucks per month plus a 300 dollar installation fee. I'd gladly pay that if it were offered.

That's fine, and you may have other issues with your current provider. My point has nothing to do with those other issues; it's about the claim that 70 Mbps is better than 20 Mbps. Quality and consistency are more important than speed, as I've described.
Peter Grimm said:
> Are you gonna sit there and tell me it isn't faster than this Time Warner 15 Mbps garbage I have now? Unlikely. People who have Google Fiber rave about it. Even if you're only getting half of the advertised speed, that's still 500 Mbps for basically the same price I'm paying now for 15 Mbps.

Your speed is throttled by the provider; all providers do this. Fast does not equal quality. Packet delay, packet drops, resends, and every hop between your house and the provider (or the location you are trying to connect to, which may traverse multiple providers' networks) affect not only speed but quality. The simplistic view that fiber to the premises is automatically better is not accurate. Even a full fiber network end to end doesn't necessarily mean faster or better, though in theory the lower distortion and noise in transmission should make for a cleaner signal end to end.
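To put a number on "fast does not equal quality": the classic Mathis model estimates the ceiling a single TCP flow can reach from round-trip time and packet loss alone, regardless of the line rate you pay for. The sketch below is an illustration of that model, not a measurement of any specific provider; the parameter values are assumed for the example.

```python
import math

def mathis_throughput_mbps(mss_bytes, rtt_s, loss_rate):
    """Approximate upper bound on single-flow TCP throughput
    (Mathis model: rate <= MSS / (RTT * sqrt(loss)))."""
    bytes_per_s = (mss_bytes / rtt_s) * (1.0 / math.sqrt(loss_rate))
    return bytes_per_s * 8 / 1e6  # convert bytes/s to megabits/s

# 1460-byte segments, 50 ms RTT, 0.01% packet loss:
print(round(mathis_throughput_mbps(1460, 0.050, 0.0001), 1))  # ~23.4
```

Under those assumed conditions a single TCP connection tops out around 23 Mbps even on a gigabit link, which is the point about delay, drops, and resends mattering more than the advertised speed.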
Peter Grimm said:
> Also, you're talking about bottlenecks. Well, if we were wired for Google Fiber, maybe the bottleneck would be somewhere other than the fiber optics. Fine, what does that do? That puts pressure on other parts of the system to upgrade their services, to upgrade their products. So yes, it's a positive, as technical advancement always is.

They're not bottlenecks; they're throttled on purpose. Load balancing means slowing down faster connections and speeding up slower ones to a specific measure. Actual bottlenecks are managed by large network providers by diverting or rerouting traffic through large SNRC's. Companies like Verizon and AT&T manage these bottlenecks (and sometimes cases where someone digs up a large copper or fiber bundle) by rerouting the traffic through mid-point network connections while the breaks are fixed. That is not the same as managing normal traffic flow, speed, and quantity.
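The "throttled on purpose" part is usually implemented with some form of traffic shaping, the textbook version being a token bucket: the bucket refills at the contracted rate, and packets are only forwarded while tokens remain. This is a toy sketch of that idea, not any provider's actual shaper; the rate and burst figures are made up for the example.

```python
import time

class TokenBucket:
    """Toy token-bucket shaper: refills at `rate` bytes/s up to a
    `burst` ceiling; a packet passes only if enough tokens remain."""
    def __init__(self, rate_bytes_per_s, burst_bytes):
        self.rate = rate_bytes_per_s
        self.capacity = burst_bytes
        self.tokens = burst_bytes
        self.last = time.monotonic()

    def allow(self, packet_bytes):
        now = time.monotonic()
        # Refill tokens for the time elapsed since the last packet.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= packet_bytes:
            self.tokens -= packet_bytes
            return True
        return False  # over the contracted rate: drop or queue the packet

bucket = TokenBucket(rate_bytes_per_s=1000, burst_bytes=1500)
print(bucket.allow(1500))  # True  - the burst allowance covers it
print(bucket.allow(1500))  # False - bucket drained, must wait for refill
```

The same mechanism explains why a short burst can hit full speed while a sustained transfer settles at the shaped rate.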
The idea that getting, and paying for, a gigabit connection to your house will drastically affect your speed surfing the internet is incorrect. As I said, service providers love that misconception because people will overpay for speed upgrades that are meaningless. Businesses, however, pay for larger connections which run through private networks and use point-to-point or backhaul connections to dump large amounts of data. They bypass the standard load balancing since their data traffic is specific and follows a pre-determined pathway. Gigabit service for hospitals, architecture firms, etc., which dump very large amounts of data (we're talking about 500 meg to 10 gigabit per minute between, say, Chicago and New York), is specifically engineered and uses redundancy such that if any connection goes down or a router dies, a secondary connection / router takes over. This is done using HSRP or VRRP with BGP-R or eBGP, depending on the type of network used.