Channel: Internet Infrastructure Archives - High Tech Forum

Multi-Gigabit SOHO Networks are Here


America’s networking industry isn’t resting on its laurels.

Xfinity just added a 200 Mbps speed boost to its high-end plans. For perspective, 200 Mbps is more than the average total download speed in all but the seven little countries with the highest download speeds.

AT&T has spun off its media properties, freeing up a lot of capital for investment in its core 5G and FTTH networks. Investors are excited about both the broadband business and the content business.

The first residential gateways (AKA “routers”) that support Wi-Fi 6e – the new 6 GHz spectrum band freed up by the Pai FCC – are on the market now, one from Asus and the other from Netgear. The gateways share a lot of design similarities, and both are good for Internet connections faster than one gig.

Gateway Scarcity

Multi-gigabit LANs are rapidly becoming reality now that Realtek and Intel are promoting low-cost (five bucks) Ethernet chips that can run at 10/100/1000/2500 Mbps. Asus and Netgear use the Realtek part, presumably because it’s more mature.

Given that Realtek’s 2.5 Gbps Ethernet part is the same price as Intel’s very popular 1 Gbps part, it’s a no-brainer for OEMs to use it. Intel has dominated the Ethernet chip business since the demise of 3Com, so it’s good to see competition.
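A quick bit of arithmetic shows what the jump from 1 to 2.5 Gbps means in practice. The 50 GB backup size below is an illustrative assumption, and the figures are ideal line rates that ignore protocol overhead:

```python
# Back-of-the-envelope: how long a hypothetical 50 GB backup takes at
# each Ethernet rate the Realtek part supports. Line rate only; real
# transfers lose a little to protocol overhead.
GIB = 2**30

def transfer_seconds(size_bytes: int, link_mbps: float) -> float:
    """Ideal transfer time in seconds at a given link rate."""
    bits = size_bytes * 8
    return bits / (link_mbps * 1_000_000)

backup = 50 * GIB
for mbps in (100, 1000, 2500):
    print(f"{mbps:>5} Mbps: {transfer_seconds(backup, mbps) / 60:.1f} minutes")
```

At gigabit the backup takes about seven minutes; at 2.5 Gbps it's under three.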

Both of the high-end gateways are in short supply, so there’s no question that gamers and other performance-sensitive users are gobbling them up. I suspect they’re both constrained by the tight market for semiconductors of all kinds. An odd consequence of Trump’s trade war with China is hoarding by Huawei and ZTE in advance of their respective bans.

Wi-Fi 6 is a Clear Upgrade

I picked up a Netgear RAXE500 to evaluate both multi-gigabit Ethernet service and Wi-Fi 6e. The office LAN runs a mixture of 1, 2.5, and 10 Gbps systems and switches already, so this was an obvious thing to do.

My speed tests say the Xfinity 1.2 Gbps Internet connection runs closer to 1.4 Gbps down and 40 Mbps up. This is far in excess of any of the individual apps that we – or anybody else – use, but it still feels faster than the 940 Mbps service.
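Putting those measurements in perspective with a little arithmetic (the figures below are the ones quoted in this post, not a general benchmark):

```python
# Quick comparison of the measured speeds quoted above.
def percent_faster(new_mbps: float, old_mbps: float) -> float:
    """How much faster new is than old, as a percentage."""
    return (new_mbps / old_mbps - 1) * 100

measured, advertised, old_tier = 1400, 1200, 940
print(f"vs. the 940 Mbps tier: {percent_faster(measured, old_tier):.0f}% faster")
print(f"overprovisioning on the 1.2 Gbps tier: {percent_faster(measured, advertised):.0f}%")
```

The upgrade is nearly 50 percent faster than the old tier, and Xfinity appears to provision the 1.2 Gbps tier roughly 17 percent above its advertised rate.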

This is probably due in part to the RAXE500 being a better gateway than the Wi-Fi 5 Asus RT-AC88U it replaced. The 802.11ax Wi-Fi in Wi-Fi 6 is both faster and more secure than its predecessor, 802.11ac.

Wi-Fi 6e Isn’t There Yet

Speed tests are at least 20 percent faster on Wi-Fi 6 vs. Wi-Fi 5, but Wi-Fi 6e is slower than regular Wi-Fi 6. It’s hard to evaluate 6e thoroughly, however, so this may change.

There is only one Wi-Fi 6e adapter on the market today, the Intel AX210 module. This is an M.2 card that’s a plug-and-play replacement for the popular Intel AX200 used in a number of laptops for both Wi-Fi and Bluetooth.

The problem is that the Windows driver hasn’t been officially released. This means downloading the beta driver from the Windows Insider program and manually installing it.

6 GHz Has a Lot of Limitations

Wi-Fi 6e is hobbled by all the power limits and sensing requirements the FCC imposed to ensure it wouldn’t interfere with incumbent utilities and other licensees. The good news is that it does penetrate sheetrock walls, but that’s pretty much the end of the list.

Comparing 802.11ac with 80 MHz channels to 802.11ax with 160 MHz channels reveals that 802.11ax is about 20 percent faster than .ac when running in the 5 GHz band but only 10 percent faster in the 6 GHz band.
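The appeal of 6 GHz is less about raw speed today than about room for wide channels. The U.S. allocation runs 5925–7125 MHz; the simple division below ignores guard bands, so the counts run slightly high for the narrower widths, but the headline result holds – seven clean 160 MHz channels, where 5 GHz offers only two, both entangled with DFS radar-detection rules:

```python
# Channel arithmetic for the 6 GHz band (5925-7125 MHz in the U.S.).
# Naive division overstates the narrower widths by one or so; the
# number that matters here is the seven 160 MHz channels.
band_mhz = 7125 - 5925  # 1200 MHz
for width in (20, 40, 80, 160):
    print(f"{width:>3} MHz channels: {band_mhz // width}")
```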

My testing is hampered by two issues, however: the Netgear gateway doesn’t have a “best channel scan” for 6 GHz, but it does have one for 5 GHz. It also didn’t play well with WPA 3, the new security mode for .ax, in the 6 GHz band. This latter problem probably has something to do with the beta Windows driver.

Wi-Fi 6 is Impressive in 5 GHz

On the plus side, my Apple and Windows devices all support .ax even though it wasn’t a thing when I bought them. Here’s to software upgrades!

Sadly, not all of my Wi-Fi devices even support 5 GHz, so they don’t get any benefit from the action going on in the higher frequencies. Fortunately, Amazon and Google have been supporting 5 GHz in their home automation devices for a while now.

Wi-Fi 6 has better radio engineering than Wi-Fi 5, and the performance upgrade is noticeable. Speeds are more stable and the Wi-Fi devices are more reliable. The security is also better as WPA 3 plugs some holes.

Next Step

I’m looking forward to testing the Asus ROG Rapture GT-AXE11000 when it shows up on Friday or so. This was the first 6e gateway on the market, so the software should be a bit more polished by now.

I have a number of Asus gateways already, so I’ll be able to try mesh networks that are pure .ax as well as mixtures of .ac and .ax. Asus has a much more sophisticated user interface, so I’ll also be able to tweak Wi-Fi parameters.

Most of all, I’m hoping to see an auto channel scan in 6 GHz and much faster response from the GUI than Netgear provides. I don’t expect to see better performance in 6 GHz since my working assumption is that the performance impairments I’ve seen are Windows related.

Recommendations

First, don’t hold your breath waiting for 6 GHz Wi-Fi to provide you with the 9.6 Gbps top-end speed vendors tout. That’s nothing but a pipe dream; you’re more likely to check in around 700 Mbps on a good channel set.

Next, upgrade to Wi-Fi 6 if you haven’t already. It’s faster, more reliable, and more secure than the ten-year-old 802.11ac.

Finally, there’s no compelling reason to upgrade to a 6 GHz system unless you get the upgrade for free in the course of pursuing a faster than 1 Gbps Internet connection or a faster than 1 Gbps LAN in your home or office. Reasons to look for faster than gig speeds on your LAN include lots of cameras and a NAS/file server that serves up frequently-used files to a collection of desktop and laptop computers.

Wrapping Up

Even at 10 Gbps, chances are your NAS will still be slower than a PCIe 4.0 SSD on your local device, even with SSD caching. That’s just life until 40 Gbps interfaces and switches get a lot cheaper.
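The units make the gap easy to see: network rates are quoted in bits per second while drive specs are in bytes per second. The ~7000 MB/s figure below is a typical PCIe 4.0 x4 NVMe sequential rating, not a measurement of any particular drive:

```python
# Converting link rates (bits/s) into drive-spec units (bytes/s),
# before any protocol overhead.
def link_MBps(gbps: float) -> float:
    """Decimal megabytes per second delivered by a link at line rate."""
    return gbps * 1000 / 8

for gbps in (1, 2.5, 10, 40):
    print(f"{gbps:>4} Gbps link -> {link_MBps(gbps):.0f} MB/s")
print("typical PCIe 4.0 x4 NVMe SSD: ~7000 MB/s sequential")
```

Even a 40 Gbps link tops out at 5000 MB/s, still short of local flash.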

So:

  • Wi-Fi 5 is over
  • Wi-Fi 6 is amazing
  • Wi-Fi 6e isn’t here yet, but it won’t be long
  • Multi-gigabit LANs are good
  • We don’t have applications for multi-gigabit Internet connections

Most important of all: Future proofing is a sad misconception, but scalable networks are where the action is.

Note: No fiber optic cables were harmed by this evaluation.

The post Multi-Gigabit SOHO Networks are Here appeared first on High Tech Forum.


Comparing Wi-Fi 6E Gateways: Netgear vs. Asus


In the last post we shared some preliminary impressions of Wi-Fi 6E vs. Wi-Fi 6. 6E is identical to 6 except that 6E uses the 6 GHz frequency band while 6 uses the 5 GHz band.

In the first test I used a Netgear RAXE500 gateway and an Intel AX210 Wi-Fi module installed in a desktop computer running Windows 10 Pro. The gateway and computer were located in adjacent rooms with one irregular wall separating them.

In this test, I swapped in an Asus GT-AXE11000 for the Netgear – in the adjacent room – and moved the Netgear to the same room as the PC, some 10 feet away. This configuration provides insight into the top speed that a 6e user is likely to see with a 1.2 Gbps Internet connection.

Bottom Line First

Neither Wi-Fi setup allowed the PC to reach the top speed the PC gets when operating over its 2.5 Gbps Ethernet connection. On the wire, the PC reaches 1.4 Gbps but the top Wi-Fi 6E speed was a bit less than 1 Gbps.

Asus and Netgear provide similar 6E speeds in the adjacent room scenario, 600 – 700 Mbps. In the same room scenario, Netgear jumps up to 900 – 1000 Mbps, faster than Wi-Fi 6.

Ping times are lower with 6E than 6 – 8 ms vs. 10 ms – even when bandwidth is similar. Ping times are lower still with Ethernet, as low as 4 ms.
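When comparing link latency, the median and jitter of a batch of pings tell you more than any single reading. The RTT samples below are made up to mirror the medians quoted above, purely to illustrate the comparison:

```python
# Illustrative only: hypothetical RTT samples shaped like the medians
# quoted above. Compare median and jitter, not one-off pings.
import statistics

samples = {
    "Ethernet": [4.1, 4.3, 4.0, 4.4, 4.2],
    "Wi-Fi 6E": [8.2, 7.9, 8.5, 8.1, 8.4],
    "Wi-Fi 6":  [10.3, 9.8, 10.6, 10.1, 11.0],
}
for name, rtts in samples.items():
    print(f"{name:>8}: median {statistics.median(rtts):.1f} ms, "
          f"jitter {statistics.pstdev(rtts):.2f} ms")
```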

Analyzing the Results

Location makes a big, big difference with Wi-Fi 6E. When the 6E gateway is ten feet from the PC, 6E is substantially faster than Wi-Fi 6.

When the gateways are in the adjacent room, speeds are pretty close with Wi-Fi 6 being just slightly faster. When the separation is two rooms on different floors, Wi-Fi 6 smokes 6E.

This suggests that the ideal scenario for Wi-Fi 6E is gateway and computer in the same room. So why would I use Wi-Fi instead of an Ethernet cable to cover 10 feet? In the real world, people only do that with mobile devices.

Caveats

All that we’ve really learned from this testing is that Wi-Fi 6E is not currently any better than Wi-Fi 6 in a suburban home where 5 GHz is not congested. Testing in a Silicon Valley apartment complex would probably produce different results.

We also can expect to see higher speeds a year from now after the Windows Wi-Fi 6E driver is in better shape. The gateway software is also likely to be better.

One thing I did notice in doing this testing is that Wi-Fi 6 makes mesh networks much less important. I didn’t find any advantage to the Asus gateway using two or three Asus mesh nodes over the single 6E gateway.

Less Obvious Side Effects

Maintaining a Wi-Fi mesh network is something of a pain because many Wi-Fi implementations on consumer products are weak. When you’re connecting things like BBQs, thermostats, wireless thermometers, and garage door openers, a single gateway network is much more reliable than a mesh.

Netgear is better at antenna design than Asus, but Asus gateways are a better fit for tweakers who want to customize all of the Wi-Fi options and settings. The Netgear UI is horrible, with short timeouts that force the administrator to re-login way too often.

Netgear also takes forever to do perfectly simple things. The company makes nice semi-pro Ethernet switches but their consumer products are pretty sad.

Conclusions

While Wi-Fi 6 is considerably better than Wi-Fi 5 in terms of performance, reliability, and security, Wi-Fi 6E offers nothing of value to the typical consumer at this point. Its lower latency has gamer appeal, but it’s not clear that serious gamers use Wi-Fi at all.

It still makes sense for people upgrading from a Wi-Fi 5 802.11ac network to invest in a 6E gateway today. If this is you, you’re the kind of person who uses their gateway for several years, so it’s likely some 6E products will come along in the next 3 – 5 years that have some appeal.

But if you already have a Wi-Fi 6 gateway, be happy: you’re good to go. Wi-Fi 6E may or may not promote innovation in the future, but it’s certainly nothing to get excited about today unless you’re a product developer.


Connecting the Unconnected


In Episode #58 of the podcast, Tom Evslin joins Richard for a discussion and demonstration of Starlink in Vermont. Tom signed up for the Starlink beta and he uses it for our Zoom session.

Tom is a genuine Internet luminary: he pioneered flat rate Internet pricing when he ran the AT&T WorldNet dialup Internet service as well as the use of the Internet to transfer voice. A company he founded, ITXC, was the first major supplier of international VoIP transport across the Internet.

Tom was also a guest on the first High Tech Forum video podcast in 2017. He signed up for the Starlink “better than nothing” beta test and was pleasantly surprised. He joined the podcast via Starlink so this is a great demonstration of the Starlink Low Earth Orbit satellite-based broadband service.

Why Starlink?

Tom lives in rural Vermont where broadband options are severely limited, so he signed up with Starlink to see whether it would be an improvement over existing alternatives. Vermont has a program underway to subsidize rural broadband in unserved areas, but the program is unlikely to provide meaningful alternatives for five years, if ever.

The Starlink FAQs on his blog, Fractals of Change, describe the state of Starlink during the beta. Since then, speed has increased to the 300 Mbps down/30 Mbps up range, with latency in the neighborhood of 45 milliseconds.

This is fast enough for every common Internet application, but it falls short of the symmetrical broadband desired by some broadband policy figures in Vermont and Washington DC today. We’re still waiting for a list of applications that will be liberated by very high speed symmetric service.

The Starlink Technology is Awesome

Tom’s first impression was surprise that there were no instructions in the box when he got his Starlink dish. But he quickly found that he didn’t need them.

When you plug in the dish, it automatically aligns itself to the Starlink constellation and before you know it you’re in business. You can see the real time map we discussed by clicking here.

Tom found the handoffs to be utterly seamless. His initial siting of the dish was partially obstructed, so he experienced a loss of connection for a few seconds every hour. He corrected that by relocating the dish.

Data Centers in Space

The most fun part of the podcast is Tom’s reflections about data centers in space. This is a takeoff on the fact that Starlink has taken to locating ground stations at data centers and building satellites that talk to each other.

That saves money and also provides better performance. I can’t summarize this part effectively, so just listen for it at 26:25 or so.

Starlink also has potential as an alternate backhaul for disaster recovery purposes. The satellites are solar powered and effectively immune to earthly disasters such as hurricanes and wildfires.

Won’t it be ironic when LEO satellite constellations are providing backhaul to FTTH access networks?

The Greatest Dangers are Political

I doubt that LEO will ever completely replace fiber backhaul and mobile networks because of capacity constraints, but it’s an intriguing piece of the puzzle that has to be solved for universal broadband to become reality.

Perhaps their greatest threats are political. We’ve already seen that terrestrial networks with political exposure can be hampered and delayed by political considerations.

5G networks need licenses for small cell siting, and many municipalities are reluctant to issue them when 5G competes with muni networks. Cities can hide behind phony health concerns when dragging their feet, but they have additional motives to delay the 5G rollout.

Vermont Wants to Protect Government Networks

At 33:15 we get into the political issues that came to the fore when Broadband Equity NOW! – the organization Tom and his wife Mary formed – petitioned the Vermont legislature to use Starlink and other existing alternatives for immediate broadband needs. Vermont refused because it doesn’t want competition for its government-owned networks.

This harkens back to the themes in our last podcast, in which John Horrigan made it clear that the greatest barrier to broadband adoption is lack of information about available networks and subsidy programs.

The Digital Divide is often sold to us by politicians as a civil engineering problem, but the reality is that it’s a social problem driven by lack of interest, income, and information. For every person unable to buy high quality broadband at any price, there are three or more who can but don’t because they don’t know why or how to get connected and where to find help paying the bill.

Understanding the Requirements

Policy makers would do well to learn one of the fundamental insights of computer engineering: a successful product development cycle begins with a good understanding of the customer’s requirements. A failed project usually begins with a fantasy about a cool solution into which a user need must be shoehorned.

The failed project scenario is all we hear from Washington and many of the states today. The iron seems to be hot, so long time advocates of government created fiber networks are eager to brand all the livestock in sight with fiber.

When we begin with the requirements we quickly find that there are many ways to satisfy them. At this point it’s more prudent to continue to rely on innovation to meet needs rather than declare one and only one technology the permanent victor.

This was a fun podcast to do, so it should be fun to watch.


IIJA: Good Start, Long Way to Go


Now that the massive Infrastructure Investment and Jobs Act (IIJA) has passed the Senate the time has come to see what’s in it and what’s not. On the “not” side we note one major omission: mobile.

When IIJA mentions mobility, it’s generally in conjunction with smart city and smart manufacturing pilot projects. When it mentions wireless, it’s generally in connection with the ban on texting while driving.

Wireless data now exceeds purely wired data on the Internet, so the emphasis on wired infrastructure is sadly out of date. The communication needs of the American people cannot be satisfied by wires alone.

Even within the scope of providing broadband to US households, wireless has a role to play. Within the scope of providing broadband to all homes today, that role is enormous.

How We Got Here

The omission of wireless technology and mobile service from the IIJA reflects the blinkers many of the Hill’s broadband specialists have worn for the entire 21st century. When the advocacy for all-fiber broadband networks emerged in the late ’90s, wireless networks were all about phone calls and their deployment was taking care of itself.

FTTH advocates (I like to say “fiber bigots”) believed they’d discovered the one magic technology to rule them all. In the grip of this delusion, they never paid much attention to the things you can’t do with cable, even one as high capacity as optical fiber.

While this opinion is most common among Democrats, the party whose broadband policy ranks are dominated by lawyers and professors of law, it’s bipartisan. Republicans, the party that relies on economists to fill its policy ranks, were led down the merry fiber path by George Gilder and friends.

Solving Today’s Problems Today

Without question, the top-down view of a nation’s broadband infrastructure is dominated by fiber. The major aggregation centers are interconnected by highway- and rail-side fiber links; the pathways to other countries are fiber pipes below the oceans; and major population centers are served by ISPs at switching centers where all of the data comes in and goes out over fiber.

All of the wires in the big picture carry information aggregated from thousands or tens of thousands of end users, so high-capacity cables are absolutely necessary. But things look very different when we analyze the universal service problem from the bottom up.

The challenge for the unconnected residence is how to reach existing infrastructure. The unconnected want to solve this problem today; solving it for the future doesn’t matter as much because there won’t be a future if we don’t solve today’s problems today.

IIJA Isn’t All Bad

With respect to the part of the broadband infrastructure problem that IIJA does address, its solutions aren’t all bad. Contrary to the wishes of the four Senators who tried to force symmetrical networks on the country, the bill sets two asymmetric guidelines for eligible projects: the current 25/3 for unserved areas and 100/20 for underserved ones. [Division F, Sec. 60102. Grants for Broadband Deployment]
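The two thresholds amount to a simple tiering rule. The sketch below is my reading of them, not language from the bill; the function name and labels are mine:

```python
# A sketch of the IIJA speed tiers: below 25/3 Mbps is unserved,
# below 100/20 is underserved. Labels and structure are the author's
# illustration, not statutory text.
def classify(down_mbps: float, up_mbps: float) -> str:
    if down_mbps < 25 or up_mbps < 3:
        return "unserved"
    if down_mbps < 100 or up_mbps < 20:
        return "underserved"
    return "served"

print(classify(10, 1))    # unserved
print(classify(50, 10))   # underserved
print(classify(300, 30))  # served
```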

The bill is also devoid of spin about “future-proof” networks, although some advocates continue to use this misleading terminology.

In place of standards and terminology intended to erase wireless from the picture, a new trope has emerged: “reliable networks”, to wit: “Access to affordable, reliable, high-speed broadband is essential to full participation in modern life in the United States.” [Sec. 60101. Findings]

Unreliable Views of Reliability

It’s unclear what this is supposed to mean. In one section, the IIJA declares an interest in smart transportation grids of: “Vehicles that send and receive information regarding vehicle movements in the network and use vehicle-to-vehicle and vehicle-to-everything communications to provide advanced and reliable connectivity.”

This is a mobile application. But in others it’s a throw-in modifier for “broadband.” The definition doesn’t help:

The term ‘‘reliable broadband service’’ means broadband service that meets performance criteria for service availability, adaptability to changing end-user requirements, length of serviceable life, or other criteria, other than upload and download speeds, as determined by the Assistant Secretary in coordination with the Commission.[Sec. 60102. Grants for Broadband Deployment]

Performance criteria for service availability are easy to define: percent uptime over some span of measurement time, such as hours of downtime per year, month, or week. But adaptability to changing end-user requirements sounds a lot like “future-proofness”. Like length of serviceable life, it’s more subjective intuition than measurable reality.
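The availability half really is straightforward arithmetic, as a quick sketch shows:

```python
# The measurable half of "reliable": converting an availability
# percentage into allowed downtime per year.
HOURS_PER_YEAR = 24 * 365

def downtime_hours_per_year(availability_pct: float) -> float:
    return HOURS_PER_YEAR * (1 - availability_pct / 100)

for pct in (99.0, 99.9, 99.99):
    print(f"{pct}% uptime allows {downtime_hours_per_year(pct):.2f} hours of downtime/year")
```

"Three nines" works out to under nine hours of downtime a year; a regulator could write that down and measure it. No comparable formula exists for "adaptability."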

And other criteria is an invitation to make arbitrary exclusions, like the exclusion of everything that’s not a government-owned fiber to the home network. I hope I’m wrong.

Lawmaker Bias v. Technical Reality

Washington has failed pretty miserably in its attempts to design broadband networks: net neutrality remains a turgid attempt to prevent self-dealing by banning legitimate engineering practices instead of by identifying offenders and prosecuting them.

Similarly, we’re still not a “future-proof” nation with respect to the visions of the future common at the turn of the 21st century. That future was eclipsed by the mobile reality in which we live.

Lawmakers admire laws that stand the test of time, but technologists seek to overturn ancient regimes in favor of new ones. When this happens, the old laws run out of serviceable life and need to be replaced by new ones.

Insisting that dynamic technology markets behave like stone tablets does us all a disservice. The realms of law and technology can hardly be more different.

Stay in Your Lane, Congress

Diverse voices are speaking out on the shortcomings of IIJA’s biased approach and incomplete solution. In an op-ed, former Democratic FCC commissioner and chair Mignon Clyburn and highly respected Republican commissioner Rob McDowell touted the efficiency of fixed wireless networks.

Fixed wireless offers a competitive option for many consumers, particularly in underserved markets where competition is lacking and fiber deployment is lagging. A comprehensive fiber network connecting every home can take years to deploy. In these situations, fixed wireless technology fortunately can provide a high-quality, lower-cost solution that can be deployed more rapidly than fiber. The capital cost per subscriber for fixed wireless is nearly 10 times less than fiber and deployment is measured in months not years, making it an effective and speedy method to connect rural, unserved and underserved communities. Furthermore, fixed wireless broadband puts downward pressure on consumer prices by bringing more competition to underserved markets.

And a recent news article expresses the desire of rural America to get better mobile service:

Mike Bucy, a fire chief based in Loon Lake, Wash., said the lack of cell service has been frustrating this summer as firefighters battle some of the worst blazes in years. They can’t always send the latest information to the public, call in extra resources, or exchange updates with neighboring firefighting forces, he said.

Better communications for first-responders is an issue in rural Idaho, too, says Chip D’Amato, executive vice president of Inland Cellular, a wireless telecom company in Lewiston, Idaho, about 140 miles south of Loon Lake. First-responders usually direct their pleas for better communications to his company “because it’s our community,” he said.
The 1998 law professor’s or stock market tout’s vision of arbitrarily fast, symmetric FTTH for urban couch potatoes falls short of today’s needs for mobile broadband and adds unnecessary delay to achieving universal broadband service.

Congress should re-prioritize broadband subsidies to meet the needs of the urban poor, forgotten rural areas, and mobile services. We live in 2021; let’s start acting like it.


Congress Digs Into Broadband


Today’s House Communications and Technology subcommittee hearing on twelve small broadband bills isn’t going to break any new ground. The bills cover a wide range of issues, but none is significant enough to warrant its own hearing.

Two of the witnesses – Cheryl A. Leanza of the United Church of Christ and Tim Donovan of the Competitive Carriers Association – will address specifics of a few of the bills while the other two – Loveland Colorado city council member John Fogle and Todd Brandenburg of small wireless company PocketiNet – will tout specific ways of building broadband networks.

The network advocates will likely deliver the most interesting testimony in light of the overall emphasis on broadband infrastructure in this Congress. Their testimony should resonate beyond the formal scope of the hearing, and I suspect there will be some fireworks between them.

John Fogle and Municipal Networks

Fogle runs a computer repair and home automation installation business in Loveland, a town of 75,000 people that neighbors Fort Collins in northern Colorado. He chairs the Information Technology and Communications Committee of the National League of Cities (NLC), lobbyist for America’s 19,000 cities and towns.

Loveland is one of the four Colorado towns in the municipal broadband business. Like the others – Longmont, Fort Collins, and Estes Park – Loveland provides water and power through government-owned utilities.

Loveland, Fort Collins, and Estes Park have entered into an Inter-Governmental Agreement (IGA) for network engineering and customer service as none has the capacity to support a broadband customer base 24×7. Effectively, each city is a retailer for a common wholesale network.

Munis Out of Phase

Fogle’s testimony is self-contradictory in some respects. While he maintains that munis are uniquely positioned to bridge the digital divide, he admits they come up short in three key dimensions: financial strength, technical skill, and scale.

He asks Congress to heal these deficiencies by subsidizing construction costs, technical capacity building, and middle mile networks that can be shared across municipal markets. While money helps, munis have some structural incentives of their own that stand in the way of their ability to effectively meet user needs.

In particular, munis are out of phase with the nation on wireless. Fogle delivers a well-worn talking point from turn of the century broadband debates:

While mobile connections are a vital service, they are not a complete substitute for fixed, in-home high-speed connections that can be used by multiple people simultaneously, and are certainly no substitute when educational interface is needed for children.[Page 4]

This is a problem because three of the six residential broadband options in Loveland today are the 5G wireless services provided by the major carriers. While these networks support mobile devices, they also support residential service to the same customer premises equipment used by Pulse, the Loveland network.

Mobile and Fixed are Complements

I doubt that more than a handful of people regard fixed and mobile as competitors anymore. We need mobile devices because people are mobile, and we need fixed connections because homes are full of devices that need to be connected to the Internet all the time.

Certainly, a man who sells home automation services knows this. People use their mobile devices to communicate with cameras and home automation devices all the time; it’s called the Internet of Things.

Mobile looks like a competitor if you’ve convinced taxpayers to support the town’s sixth broadband option by making outlandish promises. Every ad for 5G residential service is something to fear if you’ve been elected by promising more than you can deliver.

Adversarial Reasoning Leads to Bad Outcomes

The cities included in the Larimer County IGA all tried to enact unlawful zoning ordinances for 5G small cells in hopes of protecting their broadband network from competition. The principal anti-competitive features of the ordinances are overly large minimum separation distances between small cells, unreasonable setbacks from residential properties, and unwarranted safety guidelines.

These features are common throughout Colorado because of the influence of the Colorado Communications and Utility Association (CCUA), a twenty-five-year-old lobbying group of cities with broadband dreams. CCUA members always write 600 foot separation distances into their ordinances to discourage residential 5G service, even though there’s no real justification for any minimum separation.

Small cells are the functional equivalent of street lights that can be placed 250 feet apart. But there’s always somebody ready to freak out over each modification to neighborhood architecture.
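Spacing rules translate directly into cell density, which is the point of the ordinances. A quick calculation comparing the 250-foot street-light interval to the 600-foot CCUA minimum:

```python
# Cells needed to cover one mile of street frontage at each spacing:
# the 250-foot street-light interval vs. the 600-foot minimum that
# CCUA-style ordinances impose.
import math

FEET_PER_MILE = 5280

def cells_per_mile(spacing_ft: int) -> int:
    return math.ceil(FEET_PER_MILE / spacing_ft)

for spacing_ft in (250, 600):
    print(f"{spacing_ft} ft spacing -> {cells_per_mile(spacing_ft)} cells per mile")
```

Forcing 600-foot gaps cuts the allowable cell count per mile by more than half, which degrades the coverage and capacity residential 5G depends on.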

Complaints About Preemption

The Ninth Circuit upheld federal preemption of restrictive ordinances that impair the 5G rollout a year ago, in a case in which CCUA was a party. Federal preemption is necessary because of NIMBY-ism and the weak analytical capacity of city councils.

Some of the council hearings on these ordinances are truly frightening as they tend to bring out the same people who regard climate change and COVID-19 as hoaxes. I’ve written about them in other posts.

Fogle tries to justify CCUA and NLC opposition to preemption:

The National League of Cities opposes federal preemptions of local permitting and review processes that impose by-right or deemed granted requirements, or unduly restrict the ability of local governments to assess adequate and appropriate compensation for permit review or the use of public property. These one-size-fits-all mandates are an unnecessary overreach that hamper the ability of local governments to balance deployment speed with other community needs, and do not meaningfully contribute to closing the digital divide.[Page 8]

But it’s not just about money; these organizations want to hamstring regional and national carriers with all sorts of absurd separation requirements, fees, safety studies, and public notices. They want more money from permitting fees, but protecting those muni networks comes first.

Slowing down 5G deployment does not help bridge the digital divide, of course.

Just Say No

In areas where there is no high quality broadband today, it makes perfect sense for munis and electric co-ops to build fiber networks that accommodate 5G small cells as well as residences. In my view, it doesn’t make any sense for them to build the sixth broadband network at taxpayer expense.

But if the taxpayers are willing to subsidize a sixth network, it will be built. When local governments embark on this path, they need to do so on their own dime. Greenfield networks deserve heavy subsidies, but bandwagon networks do not.

The priority for Congress in the Wednesday hearing is to draw a bright line between network projects in legitimate need of federal support for construction, technical capacity development, and backhaul and those, like Loveland’s, that are simply vanity projects.

Let’s spend tax money where there’s a real need to satisfy.


Will Rinehart on Broadband Infrastructure and Inclusion


H.R.3684 – the House Infrastructure Investment and Jobs Act – includes a $65B kicker for broadband networks in its trillion dollar appropriation package. This funding will extend broadband to unserved areas using a formula based on the ratio of unserved areas in the various states.

Funding is in the form of grants to states to be administered according to specified guidelines. The definition of broadband in the bill – 25 Mbps down, 3 Mbps up, and reasonable latency – is conservative but realistic.

The bill is not bad, but it could have been worse. Today’s podcast, recorded in July, is a reminder of the progress made in the debate. Initial plans floated by some members of Congress focused on ultra-high-speed symmetrical plans that would have taken the better part of a decade to build.

The current package recognizes that some broadband installed today is better than a futuristic system that may never be installed. Economist Will Rinehart, an expert on the economics of bridging the digital divide, is the special guest. If you’re interested in broadband, competition, digital inclusion, and how public policy moves from idea to appropriation this is for you.


Show Your Cards, FAA


The FAA is playing dirty.

After failing to win support from the administration and Congress for its unwarranted ban on deployment of mid-band 5G anywhere in the country, the agency is leaning on the firms it regulates to make its case. And they’re pressing the case by selective leaks to friendly media.

It’s like the Title II net neutrality war all over again, when former chairman Genachowski leaked a plan to impose Title II and Wheeler leaked a plan not to. Both leaks turned out to be false, of course.

Today’s Reuters Leak

Today the FAA (or an ally) leaked an alleged letter to Secretary of Transportation Buttigieg from executives at Boeing and Airbus to David Shepardson of Reuters. No one has published the full text of the letter, but it seems to forecast dire consequences.

“5G interference could adversely affect the ability of aircraft to safely operate,” the letter said, adding it could have “an enormous negative impact on the aviation industry.”

The industry and Federal Aviation Administration (FAA) have raised concerns about potential interference of 5G with sensitive aircraft electronics like radio altimeters.

The FAA this month issued airworthiness directives warning 5G interference could result in flight diversions. The agency plans to provide more information before Jan. 5.

The Boeing Airbus letter cited an analysis from trade group Airlines for America (A4A) that if the FAA 5G directive had been in effect in 2019, about 345,000 passenger flights and 5,400 cargo flights would have faced delays, diversions or cancellations.

This would be an abrupt about-face for Boeing from the position it shared with the FCC during the public comment phase of its enabling regulation. Boeing said a 100 MHz guard band would provide the necessary protection to obsolete altimeters. The FCC responded by more than doubling the guard band to 220 MHz, which should have solved the problem.

Why the About-Face?

There’s nothing wrong with a change of position driven by new and better information. The Boeing letter apparently cites a study by the aviation industry’s lobbyist, Airlines for America:

The Boeing Airbus letter cited an analysis from trade group Airlines for America (A4A) that if the FAA 5G directive had been in effect in 2019, about 345,000 passenger flights and 5,400 cargo flights would have faced delays, diversions or cancellations.

Unfortunately for us the A4A study, like the Boeing/Airbus letter, is secret. While it’s not impossible for lobbyists to produce great technical insights never before seen in public discourse, it’s not exactly routine.

Based on past declarations from the aviation industry, I expect the key variable in A4A’s analysis is supported by nothing but hand-waving. That variable would be the power flux density of the 5G mid-band emissions that actually strike altimeters.

Where’s Your Propagation Model, Captain?

We’ve already seen incomplete studies from aviation in this matter. The industry’s think tank, RTCA, shared one a year ago. As we wrote, RTCA had another industry player test some altimeters in a lab setting to see how much power it took in adjacent bands to make them fail.

There would be an answer to that question for any perfectly safe neighboring system, but that’s the easy question rather than the right one. The right question is whether the threshold level is at all likely to exist in the real world.

That’s actually a tricky question because it depends on knowledge the aviation industry lacks. To answer it one needs to understand how 5G signals propagate, how they’re encoded, and how their modulation interacts with airplane surfaces and sensors.
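The gap between a lab susceptibility threshold and real-world exposure can be illustrated with a first-order link-budget sketch. This is purely illustrative and not any party’s actual model; every figure below (the base-station EIRP, the separation distance) is an assumption chosen for the example, and real models must further account for antenna downtilt, clutter, and fuselage effects:

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB (Friis): 20log10(d) + 20log10(f) + 20log10(4*pi/c)."""
    c = 299_792_458.0  # speed of light, m/s
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / c))

# Hypothetical numbers, chosen only to show the shape of the calculation:
eirp_dbm = 62.0                   # assumed base-station EIRP toward the sky
loss_db = fspl_db(300.0, 3.7e9)   # aircraft 300 m from a C-band cell site

received_dbm = eirp_dbm - loss_db
print(f"Path loss: {loss_db:.1f} dB, power at altimeter: {received_dbm:.1f} dBm")
```

A real assessment would subtract further losses from the received figure and compare the result against measured altimeter susceptibility; that comparison is exactly the propagation knowledge this post argues the aviation industry hasn’t produced.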

How to Ensure Safety

Setting the safety threshold for the 5G mid-band needs to be a collaborative effort because neither the aviation nor the wireless industry possesses full and perfect knowledge of the other’s world. Such circumstances occur over and over.

We’ve faced similar problems in determining the effect of potential interference between unlicensed 5G and Wi-Fi, Wi-Fi and Bluetooth, GPS and cellular, Wi-Fi 6E and utility networks, and GEO, LEO, and terrestrial data services in 12 GHz.

Answering questions about radio interference is what the FCC and NTIA’s Institute for Telecommunications Sciences do for a living and they’re rarely wrong. But the FCC and ITS approach it as an engineering question while aviation has other priorities.

A Heavily Subsidized Industry

Aviation’s priorities were on display in last week’s oversight hearing in the Senate. While most industries have weathered the worst of the COVID-19 pandemic to date with the Paycheck Protection Program, aviation got its own special deal, the much more generous Payroll Support Program (PSP).

The hearing was described by members as a lovefest, and that was putting it mildly. Missing from the hearing was discussion of aviation’s rapacious spectrum appetite; only Senators Blackburn and Young brought it up, and then only briefly.

None of the witnesses mentioned 5G in their written or oral statements, and when asked they emphasized that their worry was the FAA’s rash flight restrictions rather than any sort of real operational issue. Go to 2:01:35 in the hearing video to hear the head of Southwest Airlines on the FAA’s erratic behavior.

Airlines Aren’t Rocking the Boat

If this were a real problem, the CEOs who testified to Senate Commerce on December 15th would have raised it on their own. The fact that they didn’t ask for FCC intervention when pressed is also telling.

So what we have here is the judgment of a lobbying group looking to keep its sugar daddy regulator happy and two vulnerable firms – with somewhat checkered safety histories – playing the same game.

The FAA is perfectly within its authority to ground aircraft for safety reasons, but it needs to be held accountable for emergency orders that bypass the normal public comment process. Perhaps today’s media shenanigan is an attempt to avoid the next logical step in the administrative law process.

FAA Needs to be Accountable to the Public

A regulator can bypass protocol when necessary, but each such action needs to be temporary. Keeping the 5G mid-band permanently out of the hands of the firms who’ve paid good money for the right to use it is outrageous without good reasons and strong evidence.

If the FAA and the industry it regulates can produce such data, I’d like to be the first to praise it for doing such great work. But it’s a long way from clearing that bar.

For starters, they need the measurements of radio propagation in the real world that can inform a realistic predictive model. They don’t have this data because they’ve never put sensors on airplanes and recorded readings. They also don’t appear to have an accurate inventory of all the potentially vulnerable altimeters in the US.

The Money at Stake is Huge

Investment dynamics are probably hard for government agencies to grasp. In this instance, the bidders who won the mid-band auction paid a premium for rapid clearing. They’re now being robbed by the FAA because the spectrum they paid to clear is no longer usable.

It’s not conventional for agencies to compensate industries for their costly errors, but aviation is in a unique position thanks to the PSP. If the FAA and the airlines really believe that 5G poses an unacceptable risk to aviation, then let them use their PSP money to license the spectrum at issue until the problem is solved.

Faced with a choice between using the PSP subsidy for spectrum rights and spending it on route expansion, I’ve got a pretty good idea of the choice they’ll make. To understand how changes in tax law, spending for expedited clearing, and bidding for licenses interact with the FAA’s probably imaginary fears, see this column by Roger Entner.

In brief, the FAA has a lot of explaining to do. Instead of playing this game of media leak-a-thon with secret studies and mystical data, the time has come for the FAA to come clean and show its cards.

UPDATE: A4A and CTIA agreed today (Dec. 22) to share data. That was quick.


Will Rinehart on Broadband Part Two


This is the second and final part of our conversation with Will Rinehart on broadband infrastructure plans. (First part is here.)

We discuss some of the biases, information gaps, and challenges that have to be overcome in order to extend high-quality broadband to all populated parts of rural America. We also discuss the fact that broadband inclusion isn’t really a problem the nation can solve through construction projects alone.

The broadband portion of the infrastructure bill is a huge improvement over some of the early plans floated by Democratic progressives because it has a focus on immediate steps that can and must be taken for the sake of immediate progress.

Rather than building for a far distant future that may never come to pass, Congressional spending on the problem of encouraging people to connect to current networks solves 80% of the inclusion problem. The economics of competition work very differently in markets with high fixed costs. These markets work better with a consumer welfare focus than with a competition focus.

Enjoy!



The National Technology Innovation Administration


Wednesday’s hearing in the House Communications and Technology subcommittee features NTIA Administrator Alan Davidson, the missing witness from the Aviation subcommittee’s hearing on the FAA two weeks ago. The FAA’s meltdown over the 5G C-Band came about from a lack of leadership among the three critical players: the FCC, the FAA, and NTIA.

The hearing memo doesn’t call out the FAA issue, and neither does Davidson’s rather terse written testimony. But it would be a major missed opportunity if the issue didn’t get some attention.

I expect today’s press releases from the FCC and NTIA are intended to blunt some of the criticism of the two agencies’ lack of coordination. In them, the agencies promise to join each other’s advisory councils and meet monthly. That’s a good start, but it doesn’t go very far.

Too Much Politics, Not Enough Tech

The aviation industry needs 800 MHz of contiguous spectrum to figure out the altitude of an airplane. That consists of the authorized 200 MHz for direct signaling plus two guard bands of 300 MHz apiece above and below the authorized band; 200 + 300 + 300 = 800.

Aviation cried wolf because the FCC was only willing to give them 220 MHz guard bands. For context on these figures, kindly appreciate that broadcast television as a whole only requires 336 MHz, from 470 to 806 MHz, and much of that is shared.

Why does aviation need almost three times as much spectrum as the entire TV broadcasting industry just to suss out the distance between the belly of an airplane and the ground? They’ve never really explained this, but if they did the song and dance would probably come down to “that’s what we needed in 1955 and we still use the same gear.”
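The arithmetic in the preceding paragraphs is simple enough to check directly. The figures below come from the text itself; note that the 470–806 MHz TV allocation spans 336 MHz, which makes aviation’s claimed footprint roughly 2.4 times as large:

```python
# Back-of-the-envelope check of the spectrum figures discussed above.
altimeter_band_mhz = 200       # authorized radar-altimeter band
guard_each_side_mhz = 300      # guard band aviation asked for, above and below
fcc_guard_mhz = 220            # guard band the FCC actually granted

aviation_total = altimeter_band_mhz + 2 * guard_each_side_mhz
tv_total = 806 - 470           # the 470-806 MHz TV allocation cited above

print(f"Aviation's claimed footprint: {aviation_total} MHz")
print(f"TV broadcasting's footprint:  {tv_total} MHz")
print(f"Ratio: {aviation_total / tv_total:.1f}x")
```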

The Efficient Administration of the Status Quo

When your approach to inter-agency coordination is “let’s have more meetings!” it’s not surprising that the overall focus of the government spectrum policy is maintaining the status quo. That’s what goes on in these meetings the FCC and NTIA plan to attend.

Tune into a webcast of the FCC’s TAC and CSRIC and you’ll hear a lot of lobbyists and retirees talking about reliability and security. Check out NTIA’s CSMAC and you’ll hear much of the group discussing the same questions. Both agencies pay lip service to more efficient use of spectrum across their respective portfolios – government for CSMAC and the private sector for TAC – but that’s as far as it goes.

Because spectrum is over-allocated to government and to unlicensed uses, the companies that have to pay for spectrum rights do all of the work on making spectrum use more efficient and powerful. The government interest is in maintaining control of its out-sized holdings and the unlicensed industry simply wants more and more free spectrum.

Reducing Government’s Spectrum Holdings

Congress can remedy the stagnation in government spectrum use by passing a law requiring government as a whole to cut its reliance on first-right or exclusive access to spectrum in half over the next five years. It should be allowed to meet the goal by contracting its systems that use spectrum today to market suppliers (as FirstNet has), or by re-engineering its systems to be more efficient and releasing the excess to the FCC for auction.

FirstNet’s operator is free to sell services to the private sector after it has met its service obligation to government. That should be the normal way government agencies access wireless systems.

The first question that should be put to Administrator Davidson should be “How can we reduce aviation’s reliance on mid-band spectrum to a level that’s not absolutely disgraceful?” That probably won’t happen Wednesday, but it is fundamental.

How We Got Here

US government agencies have never addressed technology in the holistic way the private sector does. This isn’t a lack of virtue; it’s built into the incentive structure. Aviation gets top marks when there are no 737 MAX fiascos, not when some new app pops up on mobile devices.

So government’s use of technology has always been driven by the desire of each agency to perform each task without making headlines. The private sector looks at aviation’s 800 MHz sounding rope and asks “what could I do with a quarter of that?”

I think that’s the better question. When lawmakers put aviation on a diet perhaps agencies will begin asking the better questions themselves.

Attitudes of Scarcity Lead to Better Results

The Western United States is running out of water. The worst drought in 1,200 years is ravaging the region and there are no signs of remission.

The only way through this is get better at managing and using water than we have been. While RF spectrum isn’t in a similar crisis yet, it’s wise to prepare for an eventuality where demand far outstrips supply. Spectrum, like water, is a finite resource at each point in time even if both are reusable.

For the past twenty years, tech policy wags such as Larry Lessig have touted the virtues of a new normal based on the assumption of abundance:

My position has always been that we should regulate as lightly as we can to get a network where the business model of network owners is abundance, not scarcity. That means that network owners aren’t pricing access and striking exclusive deals with content providers with the purpose of exploiting (and hence profiting from) scarcity.

So as I said at the F.C.C. hearing (and two years before during at least three events in Washington, D.C.), my judgment is that a ban on discriminatory access is all that is necessary to achieve this objective.

This notion is based on the expectation that Moore’s Law would last forever, which it probably won’t. We had better use its last years to get a head start on the more efficacious uses of spectrum that may come about in the future.

What Can NTIA Do to Help?

Most of Congress’s priorities for NTIA – and NTIA’s own priorities – are sound. Closing the Digital Divide would be a good thing, as would better safety and security for Internet users and holding China’s international ambitions in check.

But we’re not going to improve digital inclusion by building more networks and we’re not going to balance spectrum use in the future by sending more people to meetings.

We need lawmakers to set ambitious goals for the growth of the tech sector and for government efficiency. This planet will soon be home to 10-12 billion people and we’re not manufacturing more of the finite resources they’re all going to need to live well.

With such a goal in mind, tinkering with the roles, responsibilities, incentives, and calendars of government agencies may become a bit more tractable. Let’s start asking NTIA to be the National Technology Innovation Administration.


Eric Schmidt’s Spectrum Agenda


Each change in presidential administrations reanimates old, rejected ideas in technology policy. While tech policy was once largely bipartisan, today it’s a bitter battlefield where basic facts are disputed and pathways to common goals are contested.

Case in point is the spectrum allocation policy positions championed by former Google front man Eric Schmidt. During the Obama Administration, Schmidt and his Microsoft counterpart Craig Mundie assembled a cast of semi-technicals – lawyers, policy advocates, public relations experts, investors, and academics – to put their names on a dysfunctional spectrum management plan.

The Schmidt/Mundie plan was issued in 2012 by the President’s Council of Advisors on Science and Technology as a Report to the President bearing the ponderous title: “Realizing the Full Potential of Government-Held Spectrum to Spur Economic Growth.” I was not thrilled with the PCAST report at its inception, nor with the 2019 follow up by Schmidt’s minion Milo Medin for the Defense Innovation Board. They haven’t stood up well.

The Dysfunctional PCAST Report

President Obama originally charged PCAST to develop a plan for releasing government spectrum rights to the private sector where they could be used to benefit the public. Instead of delivering such a plan, the PCAST report and its successors produced flimsy excuses for continuing to allow the government – especially the Defense Department – to hamstring wireless innovation by starving the private sector of spectrum rights:

PCAST finds that clearing and reallocation of Federal spectrum is not a sustainable basis for spectrum policy due to the high cost, lengthy time to implement, and disruption to the Federal mission. Further, although some have proclaimed that clearing and reallocation will result in significant net revenue to the government, we do not anticipate that will be the case for Federal spectrum. (PCAST report at vi.)

I addressed the claim that the well-established practice of upgrading (or sunsetting) legacy systems to reduce their spectrum footprints and selling off the excess as flexible-use licenses was “unsustainable” in a paper on the “upgrade-and-repack” practice I presented at the TPRC conference in 2013. In essence, spectrum rights transfers are as sustainable as innovation itself. Innovation is where spectrum comes from.

The claim that reassigning spectrum rights takes too long – decades in the PCAST report’s estimation – is contradicted by reality. The first phase of the C-Band reallocation (from GEO satellites to 5G) took less than two years from FCC Report and Order to deployment.

SES C-Band Timeline

The major stumbling block to C-Band reallocation was the government itself, with FAA raising last minute objections. Meanwhile, the CBRS system based on the PCAST model has proved to be nothing more than a solution in search of a problem.

Dysfunction Loves Dysfunction

While commercial operators regard PCAST sharing as impractical, DoD continues to pretend it’s beautiful. Writing for the Defense Innovation Board on 5G, Google’s Medin (a signatory of the PCAST report) parrots its sustainability (“…status quo of spectrum allocation is unsustainable”) and delay claims:

The average time it takes to “clear” spectrum (relocate existing users and systems to other parts of the spectrum) and then release it to the civil sector, either through auction, direct assignment, or other methods, is typically upwards of ~10 years. (DIB 5G Study at 10.)

We now know this estimate assumes zero cooperation on the part of the vacating party. It’s a question of motivation, as the satellite operators were compensated for speedy relocation.

To its credit, the DIB 5G study also pans CBRS:

There is precedent for successful spectrum-sharing – in 2010, the FCC opened up the 3550-3700 MHz bandwidth (known as Citizens Broadband Radio Service, or CBRS) to the commercial sector. However, this process took more than five years, a timeframe that is untenable in the current competitive environment. (DIB 5G Study at 11.)

Medin’s alternative to upgrade-and-repack and CBRS is even worse, however. He wants 5G operators to share a single network:

DoD should encourage other government agencies to incentivize industry to adopt a common 5G network for sub-6 deployment. Incentives can include: accelerated depreciation, tax incentives, low interest loans and government purchase of equipment and services. (DIB 5G Study at 28.)

Medin’s justification – improved security – doesn’t follow from experience. If attackers can focus on a single network, their job becomes much easier than attacking a dozen or so. Redundancy is a key element of reliability.

Inventing New Arguments for Old Ideas

Oblivious to the judgments of history, Schmidt still hawks his spectrum policy pink elephants on the pages of the Wall Street Journal and through venues controlled by his Schmidt Futures investment fund. The only change is the replacement of the discredited PCAST claims of sustainability and delay with even more outlandish claims about performance:

AT&T’s and Verizon’s new 5G networks are often significantly slower than the 4G networks they replace…America’s average 5G mobile internet speed is roughly 75 megabits per second, which is abysmal. In China’s urban centers 5G phones get average speeds of 300 megabits per second.

These nonsensical claims aren’t supported by impartial observations. Open Signal says US 5G users see download speeds in the 200 Mbps range. Our coverage is vastly better than China’s; the China card is the last refuge of scoundrels.

Open Signal Midband 5G Assessment

What is Schmidt’s Agenda?

Eric Schmidt and people close to him have been bashing US spectrum policy for ten years. It’s not clear to me why they’re doing this.

Schmidt is not a spectrum engineer, neither schooled nor self-taught. After embracing a whole new architecture for spectrum sharing in the 2012 PCAST report, Schmidt has switched to a “government first” approach that seeks to model the US economy after China.

The PCAST sharing model was never going to work, but Schmidt appears to have backed it with full faith and passion despite a complete lack of evidence. I think we can say the same thing about his current “let’s be like China” model. China is actually lagging the US on all the important dimensions of 5G deployment.

A Reliable Spectrum Pipeline

The only consistency here is the lack of consistency – and a lack of study. Instead of chasing a series of shiny objects the US needs a predictable and reliable practice for transferring spectrum rights among and between old and new applications.

The US needs to create a system that keeps spectrum licenses in circulation, like dollars in the economy. Every technical system that uses spectrum today will be obsolete some day.

The spectrum pipeline acknowledges that fact and leverages it to make spectrum licenses available to the technical systems of tomorrow. I’ll dig deeper into that subject in forthcoming posts.





