
IIJA: Good Start, Long Way to Go


Now that the massive Infrastructure Investment and Jobs Act (IIJA) has passed the Senate, the time has come to see what’s in it and what’s not. On the “not” side, we note one major omission: mobile.

When IIJA mentions mobility, it’s generally in conjunction with smart city and smart manufacturing pilot projects. When it mentions wireless, it’s generally in connection with the ban on texting while driving.

Wireless data now exceeds purely wired data on the Internet, so the emphasis on wired infrastructure is sadly out of date. The communication needs of the American people cannot be satisfied by wires alone.

Even within the scope of providing broadband to US households, wireless has a role to play. In connecting all homes today, that role is enormous.

How We Got Here

The omission of wireless technology and mobile service from the IIJA reflects the blinkers many of the Hill’s broadband specialists have worn for the entire 21st century. When the advocacy for all-fiber broadband networks emerged in the late ’90s, wireless networks were all about phone calls and their deployment was taking care of itself.

FTTH advocates (I like to say “fiber bigots”) believed they’d discovered the one magic technology to rule them all. In the grip of this delusion, they never paid much attention to the things you can’t do with cable, even one as high-capacity as optical fiber.

While this opinion is most common among Democrats, the party whose broadband policy ranks are dominated by lawyers and professors of law, it’s bipartisan. Republicans, the party that relies on economists to fill its policy ranks, were led down the merry fiber path by George Gilder and friends.

Solving Today’s Problems Today

Without question, the top-down view of a nation’s broadband infrastructure is dominated by fiber. The major aggregation centers are interconnected by highway- and rail-side fiber links; the pathways to other countries are fiber pipes below the oceans; and major population centers are served by ISPs at switching centers where all of the data comes in and goes out over fiber.

All of the wires in the big picture carry information aggregated from thousands or tens of thousands of end users, so high-capacity cables are absolutely necessary. But things look very different when we analyze the universal service problem from the bottom up.

The challenge for the unconnected residence is how to reach existing infrastructure. The unconnected want to solve this problem today; solving it for the future doesn’t matter as much because there won’t be a future if we don’t solve today’s problems today.

IIJA Isn’t All Bad

With respect to the part of the broadband infrastructure problem that IIJA does address, its solutions aren’t all bad. Contrary to the wishes of the four Senators who tried to force symmetrical networks on the country, the bill sets two asymmetric guidelines for eligible projects: the current 25/3 for unserved areas and 100/20 for underserved ones. [Division F, Sec. 60102. Grants for Broadband Deployment]

The bill is also devoid of spin about “future-proof” networks, although some advocates continue to use this misleading terminology.

In place of standards and terminology intended to erase wireless from the picture, a new trope has emerged: “reliable networks”, to wit: “Access to affordable, reliable, high-speed broadband is essential to full participation in modern life in the United States.” [Sec. 60101. Findings]

Unreliable Views of Reliability

It’s unclear what this is supposed to mean. In one section, the IIJA declares an interest in smart transportation grids made up of: “Vehicles that send and receive information regarding vehicle movements in the network and use vehicle-to-vehicle and vehicle-to-everything communications to provide advanced and reliable connectivity.”

This is a mobile application. But in other sections “reliable” is a throw-in modifier for “broadband.” The definition doesn’t help:

The term ‘‘reliable broadband service’’ means broadband service that meets performance criteria for service availability, adaptability to changing end-user requirements, length of serviceable life, or other criteria, other than upload and download speeds, as determined by the Assistant Secretary in coordination with the Commission. [Sec. 60102. Grants for Broadband Deployment]

Performance criteria for service availability are easy to define: percent uptime over some span of measurement time, such as hours of downtime per year, month, or week. But “adaptability to changing end-user requirements” sounds a lot like “future-proofness.” Like “length of serviceable life,” it’s more subjective intuition than measurable reality.
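
To make the availability criterion concrete, here is a minimal sketch in Python of how percent uptime converts downtime into an availability figure (the downtime numbers are hypothetical, chosen only for illustration):

```python
# Percent uptime over a measurement period; downtime figures are
# hypothetical, for illustration only.
HOURS_PER_YEAR = 365 * 24  # 8,760

def availability_pct(downtime_hours: float, period_hours: float = HOURS_PER_YEAR) -> float:
    """Percent uptime over the measurement period."""
    return 100.0 * (period_hours - downtime_hours) / period_hours

# "Three nines" (99.9%) allows about 8.76 hours of downtime per year;
# an hour of downtime per week works out to roughly 99.4%.
print(availability_pct(8.76))   # 99.9
print(availability_pct(52.56))  # ~99.4
```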

And “other criteria” is an invitation to make arbitrary exclusions, like the exclusion of everything that’s not a government-owned fiber-to-the-home network. I hope I’m wrong.

Lawmaker Bias v. Technical Reality

Washington has failed pretty miserably in its attempts to design broadband networks: net neutrality remains a turgid attempt to prevent self-dealing by banning legitimate engineering practices instead of by identifying offenders and prosecuting them.

Similarly, we’re still not a “future-proof” nation with respect to the visions of the future common at the turn of the 21st century. That future was eclipsed by the mobile reality in which we live.

Lawmakers admire laws that stand the test of time, but technologists seek to overturn ancient regimes in favor of new ones. When this happens, the old laws run out of serviceable life and need to be replaced by new ones.

Insisting that dynamic technology markets behave like stone tablets does us all a disservice. The realms of law and technology can hardly be more different.

Stay in Your Lane, Congress

Diverse voices are speaking out on the shortcomings of IIJA’s biased approach and incomplete solution. In an op-ed, former Democratic FCC commissioner and chair Mignon Clyburn and highly respected Republican commissioner Rob McDowell touted the efficiency of fixed wireless networks.

Fixed wireless offers a competitive option for many consumers, particularly in underserved markets where competition is lacking and fiber deployment is lagging. A comprehensive fiber network connecting every home can take years to deploy. In these situations, fixed wireless technology fortunately can provide a high-quality, lower-cost solution that can be deployed more rapidly than fiber. The capital cost per subscriber for fixed wireless is nearly 10 times less than fiber and deployment is measured in months not years, making it an effective and speedy method to connect rural, unserved and underserved communities. Furthermore, fixed wireless broadband puts downward pressure on consumer prices by bringing more competition to underserved markets.

And a recent news article expresses the desire of rural America to get better mobile service:

Mike Bucy, a fire chief based in Loon Lake, Wash., said the lack of cell service has been frustrating this summer as firefighters battle some of the worst blazes in years. They can’t always send the latest information to the public, call in extra resources, or exchange updates with neighboring firefighting forces, he said.

Better communications for first-responders is an issue in rural Idaho, too, says Chip D’Amato, executive vice president of Inland Cellular, a wireless telecom company in Lewiston, Idaho, about 140 miles south of Loon Lake. First-responders usually direct their pleas for better communications to his company “because it’s our community,” he said.

The 1998 law professor’s or stock market tout’s vision of arbitrarily fast, symmetric FTTH for urban couch potatoes falls short of today’s needs for mobile broadband and adds unnecessary delay to achieving universal broadband service.

Congress should re-prioritize broadband subsidies to meet the needs of the urban poor, forgotten rural areas, and mobile services. We live in 2021; let’s start acting like it.



Congress Digs Into Broadband


Today’s House Communications and Technology subcommittee hearing on twelve small broadband bills isn’t going to break any new ground. The bills cover a wide range of issues, but none is significant enough to warrant its own hearing.

Two of the witnesses – Cheryl A. Leanza of the United Church of Christ and Tim Donovan of the Competitive Carriers Association – will address specifics of a few of the bills while the other two – Loveland Colorado city council member John Fogle and Todd Brandenburg of small wireless company PocketiNet – will tout specific ways of building broadband networks.

The network advocates will likely deliver the most interesting testimony in light of the overall emphasis on broadband infrastructure in this Congress. Their testimony should resonate beyond the formal scope of the hearing, and I suspect there will be some fireworks between them.

John Fogle and Municipal Networks

Fogle runs a computer repair and home automation installation business in Loveland, a town of 75,000 people that neighbors Fort Collins in northern Colorado. He chairs the Information Technology and Communications Committee of the National League of Cities (NLC), lobbyist for America’s 19,000 cities and towns.

Loveland is one of the four Colorado towns in the municipal broadband business. Like the others – Longmont, Fort Collins, and Estes Park – Loveland provides water and power through government-owned utilities.

Loveland, Fort Collins, and Estes Park have entered into an Inter-Governmental Agreement (IGA) for network engineering and customer service as none has the capacity to support a broadband customer base 24×7. Effectively, each city is a retailer for a common wholesale network.

Munis Out of Phase

Fogle’s testimony is self-contradictory in some respects. While he maintains that munis are uniquely positioned to bridge the digital divide, he admits they come up short in three key dimensions: financial strength, technical skill, and scale.

He asks Congress to heal these deficiencies by subsidizing construction costs, technical capacity building, and middle mile networks that can be shared across municipal markets. While money helps, munis have some structural incentives of their own that stand in the way of their ability to effectively meet user needs.

In particular, munis are out of phase with the nation on wireless. Fogle delivers a well-worn talking point from turn-of-the-century broadband debates:

While mobile connections are a vital service, they are not a complete substitute for fixed, in-home high-speed connections that can be used by multiple people simultaneously, and are certainly no substitute when educational interface is needed for children.[Page 4]

This is a problem because three of the six residential broadband options in Loveland today are the 5G wireless services provided by the major carriers. While these networks support mobile devices, they also support residential service to the same customer premises equipment used by Pulse, the Loveland network.

Mobile and Fixed are Complements

I doubt that more than a handful of people regard fixed and mobile as competitors anymore. We need mobile devices because people are mobile, and we need fixed connections because homes are full of devices that need to be connected to the Internet all the time.

Certainly, a man who sells home automation services knows this. People use their mobile devices to communicate with cameras and home automation devices all the time; it’s called the Internet of Things.

Mobile looks like a competitor if you’ve convinced taxpayers to support the town’s sixth broadband option by making outlandish promises. Every ad for 5G residential service is something to fear if you’ve been elected by promising more than you can deliver.

Adversarial Reasoning Leads to Bad Outcomes

The cities included in the Larimer County IGA all tried to enact unlawful zoning ordinances for 5G small cells in hopes of protecting their broadband network from competition. The principal anti-competitive features of the ordinances are overly large minimum separation distances between small cells, unreasonable setbacks from residential properties, and unwarranted safety guidelines.

These features are common throughout Colorado because of the influence of the Colorado Communications and Utility Association (CCUA), a twenty-five-year-old lobbying group of cities with broadband dreams. CCUA members always write 600-foot separation distances into their ordinances to discourage residential 5G service, even though there’s no real justification for any minimum separation.

Small cells are the functional equivalent of street lights that can be placed 250 feet apart. But there’s always somebody ready to freak out over each modification to neighborhood architecture.

Complaints About Preemption

The Ninth Circuit upheld federal preemption of restrictive ordinances that impair the 5G rollout a year ago, in a case in which CCUA was a party. Federal preemption is necessary because of NIMBY-ism and the weak analytical capacity of city councils.

Some of the council hearings on these ordinances are truly frightening as they tend to bring out the same people who regard climate change and COVID-19 as hoaxes. I’ve written about them in other posts.

Fogle tries to justify CCUA and NLC opposition to preemption:

The National League of Cities opposes federal preemptions of local permitting and review processes that impose by-right or deemed granted requirements, or unduly restrict the ability of local governments to assess adequate and appropriate compensation for permit review or the use of public property. These one-size-fits-all mandates are an unnecessary overreach that hamper the ability of local governments to balance deployment speed with other community needs, and do not meaningfully contribute to closing the digital divide.[Page 8]

But it’s not just about money; these organizations want to hamstring regional and national carriers with all sorts of absurd separation requirements, fees, safety studies, and public notices. They want more money from permitting fees, but protecting those muni networks comes first.

Slowing down 5G deployment does not help bridge the digital divide, of course.

Just Say No

In areas where there is no high quality broadband today, it makes perfect sense for munis and electric co-ops to build fiber networks that accommodate 5G small cells as well as residences. In my view, it doesn’t make any sense for them to build the sixth broadband network at taxpayer expense.

If the taxpayers are willing to subsidize a sixth network, it will be built regardless. But when local governments embark on this path, they need to do so on their own dime. Greenfield networks deserve heavy subsidies; bandwagon networks do not.

The priority for Congress in the Wednesday hearing is to draw a bright line between network projects in legitimate need of federal support for construction, technical capacity development, and backhaul, and those, like Loveland, that are simply vanity projects.

Let’s spend tax money where there’s a real need to satisfy.


Will Rinehart on Broadband Infrastructure and Inclusion


H.R.3684 – the House Infrastructure Investment and Jobs Act – includes a $65B kicker for broadband networks in its trillion-dollar appropriation package. This funding will extend broadband to unserved areas using a formula based on the ratio of unserved areas in the various states.

Funding is in the form of grants to states to be administered according to specified guidelines. The definition of broadband in the bill – 25 Mbps down, 3 Mbps up, and reasonable latency – is conservative but realistic.

The bill is not bad, and it could have been much worse. Today’s podcast, recorded in July, is a reminder of the progress made in the debate. Initial plans floated by some members of Congress focused on ultra-high-speed symmetrical plans that would have taken the better part of a decade to build.

The current package recognizes that some broadband installed today is better than a futuristic system that may never be installed. Economist Will Rinehart, an expert on the economics of bridging the digital divide, is the special guest. If you’re interested in broadband, competition, digital inclusion, and how public policy moves from idea to appropriation this is for you.


Show Your Cards, FAA


The FAA is playing dirty.

After failing to win support from the administration and Congress for its unwarranted ban on deployment of mid-band 5G anywhere in the country, the agency is leaning on the firms it regulates to make its case. And it’s pressing that case through selective leaks to friendly media.

It’s like the Title II net neutrality war all over again, when former chairman Genachowski leaked a plan to impose Title II and Wheeler leaked a plan not to. Both leaks turned out to be false, of course.

Today’s Reuters Leak

Today the FAA (or an ally) leaked to David Shepardson of Reuters an alleged letter from executives at Boeing and Airbus to Secretary of Transportation Buttigieg. No one has published the full text of the letter, but it seems to forecast dire consequences.

“5G interference could adversely affect the ability of aircraft to safely operate,” the letter said, adding it could have “an enormous negative impact on the aviation industry.”

The industry and Federal Aviation Administration (FAA) have raised concerns about potential interference of 5G with sensitive aircraft electronics like radio altimeters.

The FAA this month issued airworthiness directives warning 5G interference could result in flight diversions. The agency plans to provide more information before Jan. 5.

The Boeing Airbus letter cited an analysis from trade group Airlines for America (A4A) that if the FAA 5G directive had been in effect in 2019, about 345,000 passenger flights and 5,400 cargo flights would have faced delays, diversions or cancellations.

This would be an abrupt about-face for Boeing from the position it shared with the FCC during the public comment phase of its enabling regulation. Boeing said a 100 MHz guard band would provide the necessary protection to obsolete altimeters. The FCC responded by more than doubling the guard band to 220 MHz, which should have solved the problem.

Why the About-Face?

There’s nothing wrong with a change of position driven by new and better information. The Boeing letter apparently cites a study by the aviation industry’s lobbyist, Airlines for America:

The Boeing Airbus letter cited an analysis from trade group Airlines for America (A4A) that if the FAA 5G directive had been in effect in 2019, about 345,000 passenger flights and 5,400 cargo flights would have faced delays, diversions or cancellations.

Unfortunately for us, the A4A study, like the Boeing/Airbus letter, is secret. While it’s not impossible for lobbyists to produce great technical insights never before seen in public discourse, it’s not exactly routine.

Based on past declarations from the aviation industry, I expect the key variable in A4A’s analysis is supported by nothing but hand-waving. That variable would be the power flux density of the 5G mid-band emissions that actually strike altimeters.

Where’s Your Propagation Model, Captain?

We’ve already seen incomplete studies from aviation in this matter. The industry’s think tank, RTCA, shared one a year ago. As we wrote, RTCA had another industry player test some altimeters in a lab setting to see how much power it took in adjacent bands to make them fail.

There would be an answer to that question for any perfectly safe neighboring system, but that’s the easy question rather than the right one. The right question is whether the threshold level is at all likely to exist in the real world.

That’s actually a tricky question because it depends on knowledge the aviation industry lacks. To answer it one needs to understand how 5G signals propagate, how they’re encoded, and how their modulation interacts with airplane surfaces and sensors.

How to Ensure Safety

Setting the safety threshold for the 5G mid-band needs to be a collaborative effort because neither the aviation nor the wireless industry possesses full and perfect knowledge of the other’s world. Such circumstances occur over and over.

We’ve faced similar problems in determining the effect of potential interference between unlicensed 5G and Wi-Fi, Wi-Fi and Bluetooth, GPS and cellular, Wi-Fi 6E and utility networks, and GEO, LEO, and terrestrial data services in 12 GHz.

Answering questions about radio interference is what the FCC and NTIA’s Institute for Telecommunication Sciences do for a living, and they’re rarely wrong. But the FCC and ITS approach it as an engineering question while aviation has other priorities.

A Heavily Subsidized Industry

Aviation’s priorities were on display in last week’s oversight hearing in the Senate. While most industries have weathered the worst of the COVID-19 pandemic to date with the Paycheck Protection Program, aviation got its own special deal, the much more generous Payroll Support Program (PSP).

The hearing was described by members as a lovefest, and that was putting it mildly. Missing from the hearing was discussion of aviation’s rapacious spectrum appetite; only Senators Blackburn and Young brought it up, and then only briefly.

None of the witnesses mentioned 5G in their written or oral statements, and when asked they emphasized that their worry was the FAA’s rash flight restrictions rather than any sort of real operational issue. Go to 2:01:35 in the hearing video to hear the head of Southwest Airlines on the FAA’s erratic behavior.

Airlines Aren’t Rocking the Boat

If this were a real problem, the CEOs who testified to Senate Commerce on December 15th would have raised it on their own. The fact that they didn’t ask for FCC intervention when pressed is also telling.

So what we have here is the judgment of a lobbying group looking to keep its sugar daddy regulator happy and two vulnerable firms – with somewhat checkered safety histories – playing the same game.

The FAA is perfectly within its authority to ground aircraft for safety reasons, but it needs to be held accountable for emergency orders that bypass the normal public comment process. Perhaps today’s media shenanigan is an attempt to avoid the next logical step in the administrative law process.

FAA Needs to be Accountable to the Public

A regulator can bypass protocol when necessary, but each such action needs to be temporary. Keeping the 5G mid-band permanently out of the hands of the firms who’ve paid good money for the right to use it is outrageous without good reasons and strong evidence.

If the FAA and the industry it regulates can produce such data, I’d like to be the first to praise it for doing such great work. But it’s a long way from clearing that bar.

For starters, they need the measurements of radio propagation in the real world that can inform a realistic predictive model. They don’t have this data because they’ve never put sensors on airplanes and recorded readings. They also don’t appear to have an accurate inventory of all the potentially vulnerable altimeters in the US.

The Money at Stake is Huge

Investment dynamics are probably hard for government agencies to grasp. In this instance, the bidders who won the mid-band auction paid a premium for rapid clearing. They’re now getting robbed by the FAA because the once-clear spectrum is no longer usable.

It’s not conventional for agencies to compensate industries for their costly errors, but aviation is in a unique position thanks to the PSP. If the FAA and the airlines really believe that 5G poses an unacceptable risk to aviation, then let them use their PSP money to license the spectrum at issue until the problem is solved.

Faced with a choice between using the PSP subsidy for spectrum rights and spending it on route expansion, I’ve got a pretty good idea of the choice they’ll make. To understand how changes in tax law, spending for expedited clearing, and bidding for licenses interact with the FAA’s probably imaginary fears, see this column by Roger Entner.

In brief, the FAA has a lot of explaining to do. Instead of playing this game of media leak-a-thon with secret studies and mystical data, the time has come for the FAA to come clean and show its cards.

UPDATE: A4A and CTIA agreed today (Dec. 22) to share data. That was quick.


Will Rinehart on Broadband Part Two


This is the second and final part of our conversation with Will Rinehart on broadband infrastructure plans. (First part is here.)

We discuss some of the biases, information gaps, and challenges that have to be overcome in order to extend high-quality broadband to all populated parts of rural America. We also discuss the fact that broadband inclusion isn’t really a problem the nation can solve through construction projects alone.

The broadband portion of the infrastructure bill is a huge improvement over some of the early plans floated by Democratic progressives because it has a focus on immediate steps that can and must be taken for the sake of immediate progress.

Rather than building for a far distant future that may never come to pass, Congressional spending on the problem of encouraging people to connect to current networks solves 80% of the inclusion problem. The economics of competition work very differently in markets with high fixed costs. These markets work better with a consumer welfare focus than with a competition focus.

Enjoy!


The National Technology Innovation Administration


Wednesday’s hearing in the House Communications and Technology subcommittee features NTIA Administrator Alan Davidson, the missing witness from the Aviation subcommittee’s hearing on the FAA two weeks ago. The FAA’s meltdown over the 5G C-Band came about from a lack of leadership among the three critical players: the FCC, the FAA, and NTIA.

The hearing memo doesn’t call out the FAA issue, and neither does Davidson’s rather terse written testimony. But it would be a major missed opportunity if the issue didn’t get some attention.

I expect today’s press releases from the FCC and NTIA are intended to blunt some of the criticism of the two agencies’ lack of coordination. In them, the agencies promise to join each other’s advisory councils and meet monthly. That’s a good start, but it doesn’t go very far.

Too Much Politics, Not Enough Tech

The aviation industry needs 800 MHz of contiguous spectrum to figure out the altitude of an airplane. That consists of the authorized 200 MHz for direct signaling plus two guard bands of 300 MHz apiece above and below the authorized band; 200 + 300 + 300 = 800.

Aviation cried wolf because the FCC was only willing to give them 220 MHz guard bands. For context on these figures, kindly appreciate that broadcast television as a whole only requires 336 MHz, from 470 to 806 MHz, and much of that is shared.

Why does aviation need almost three times as much spectrum as the entire TV broadcasting industry just to suss out the distance between the belly of an airplane and the ground? They’ve never really explained this, but if they did, the song and dance would probably come down to “that’s what we needed in 1955 and we still use the same gear.”

The Efficient Administration of the Status Quo

When your approach to inter-agency coordination is “let’s have more meetings!” it’s not surprising that the overall focus of government spectrum policy is maintaining the status quo. That’s what goes on in the meetings the FCC and NTIA plan to attend.

Tune into a webcast of the FCC’s TAC and CSRIC and you’ll hear a lot of lobbyists and retirees talking about reliability and security. Check out NTIA’s CSMAC and you’ll hear much of the group discussing the same questions. Both agencies pay lip service to more efficient use of spectrum across their respective portfolios – government for CSMAC and the private sector for TAC – but that’s as far as it goes.

Because spectrum is over-allocated to government and to unlicensed uses, the companies that have to pay for spectrum rights do all of the work on making spectrum use more efficient and powerful. The government interest is in maintaining control of its out-sized holdings and the unlicensed industry simply wants more and more free spectrum.

Reducing Government’s Spectrum Holdings

Congress can remedy the stagnation in government spectrum use by passing a law requiring government as a whole to cut its reliance on first-right or exclusive access to spectrum in half over the next five years. It should be allowed to meet the goal by contracting its systems that use spectrum today to market suppliers (as FirstNet has) or by re-engineering its systems to be more efficient and releasing the excess to the FCC for auction.

FirstNet’s operator is free to sell services to the private sector after it has met its service obligation to government. That should be the normal way government agencies access wireless systems.

The first question that should be put to Administrator Davidson should be “How can we reduce aviation’s reliance on mid-band spectrum to a level that’s not absolutely disgraceful?” That probably won’t happen Wednesday, but it is fundamental.

How We Got Here

US government agencies have never addressed technology in the holistic way the private sector does. This isn’t a lack of virtue; it’s built into the incentive structure. Aviation gets top marks when there are no 737 MAX fiascos, not when some new app pops up on mobile devices.

So government’s use of technology has always been driven by the desire of each agency to perform each task without making headlines. The private sector looks at aviation’s 800 MHz sounding rope and asks “what could I do with a quarter of that?”

I think that’s the better question. When lawmakers put aviation on a diet perhaps agencies will begin asking the better questions themselves.

Attitudes of Scarcity Lead to Better Results

The Western United States is running out of water. The worst drought in 1,200 years is ravaging the region and there are no signs of remission.

The only way through this is to get better at managing and using water than we have been. While RF spectrum isn’t in a similar crisis yet, it’s wise to prepare for an eventuality where demand far outstrips supply. Spectrum, like water, is a finite resource at each point in time even if both are reusable.

For the past twenty years, tech policy wags such as Larry Lessig have touted the virtues of a new normal based on the assumption of abundance.

My position has always been that we should regulate as lightly as we can to get a network where the business model of network owners is abundance, not scarcity. That means that network owners aren’t pricing access and striking exclusive deals with content providers with the purpose of exploiting (and hence profiting from) scarcity.

So as I said at the F.C.C. hearing (and two years before during at least three events in Washington, D.C.), my judgment is that a ban on discriminatory access is all that is necessary to achieve this objective.

This notion is based on expectations that Moore’s Law would last forever, which it probably won’t. We had better use its last years to get a head start on the more efficacious uses of spectrum that may come about in the future.

What Can NTIA Do to Help?

Most of Congress’s priorities for NTIA – and NTIA’s own priorities – are sound. Closing the Digital Divide would be a good thing, as would better safety and security for Internet users and holding China’s international ambitions in check.

But we’re not going to improve digital inclusion by building more networks and we’re not going to balance spectrum use in the future by sending more people to meetings.

We need lawmakers to set ambitious goals for the growth of the tech sector and for government efficiency. This planet will soon be home to 10-12 billion people and we’re not manufacturing more of the finite resources they’re all going to need to live well.

With such a goal in mind, tinkering with the roles, responsibilities, incentives, and calendars of government agencies may become a bit more tractable. Let’s start asking NTIA to be the National Technology Innovation Administration.


Eric Schmidt’s Spectrum Agenda


Each change in presidential administrations reanimates old, rejected ideas in technology policy. While tech policy was once largely bipartisan, today it’s a bitter battlefield where basic facts are disputed and pathways to common goals are contested.

Case in point is the spectrum allocation policy positions championed by former Google front man Eric Schmidt. During the Obama Administration, Schmidt and his Microsoft counterpart Craig Mundie assembled a cast of semi-technicals – lawyers, policy advocates, public relations experts, investors, and academics – to put their names on a dysfunctional spectrum management plan.

The Schmidt/Mundie plan was issued in 2012 by the President’s Council of Advisors on Science and Technology as a Report to the President bearing the ponderous title: “Realizing the Full Potential of Government-Held Spectrum to Spur Economic Growth.” I was not thrilled with the PCAST report at its inception, nor with the 2019 follow-up by Schmidt’s minion Milo Medin for the Defense Innovation Board. They haven’t stood up well.

The Dysfunctional PCAST Report

President Obama originally charged PCAST to develop a plan for releasing government spectrum rights to the private sector where they could be used to benefit the public. Instead of delivering such a plan, the PCAST report and its successors produced flimsy excuses for continuing to allow the government – especially the Defense Department – to hamstring wireless innovation by starving the private sector of spectrum rights:

PCAST finds that clearing and reallocation of Federal spectrum is not a sustainable basis for spectrum policy due to the high cost, lengthy time to implement, and disruption to the Federal mission. Further, although some have proclaimed that clearing and reallocation will result in significant net revenue to the government, we do not anticipate that will be the case for Federal spectrum. (PCAST report at vi.)

I addressed the claim that the well-established practice of upgrading (or sunsetting) legacy systems to reduce their spectrum footprints and selling off the excess as flexible use licenses was “unsustainable” in a paper on the “upgrade-and-repack” practice I presented at the TPRC conference in 2013. In essence, spectrum rights transfers are as sustainable as innovation itself. Innovation is where spectrum comes from.

The claim that reassigning spectrum rights takes too long – decades in the PCAST report’s estimation – is contradicted by reality. The first phase of the C-Band reallocation (from GEO satellites to 5G) took less than two years from FCC Report and Order to deployment.

[Figure: SES C-Band Timeline]

The major stumbling block to C-Band reallocation was the government itself, with FAA raising last minute objections. Meanwhile, the CBRS system based on the PCAST model has proved to be nothing more than a solution in search of a problem.

Dysfunction Loves Dysfunction

While commercial operators regard PCAST sharing as impractical, DoD continues to pretend it’s beautiful. Writing for the Defense Innovation Board on 5G, Google’s Medin (a PCAST report signatory) parrots its sustainability (“…status quo of spectrum allocation is unsustainable”) and delay claims:

The average time it takes to “clear” spectrum (relocate existing users and systems to other parts of the spectrum) and then release it to the civil sector, either through auction, direct assignment, or other methods, is typically upwards of ~10 years. (DIB 5G Study at 10.)

We now know this estimate assumes zero cooperation on the part of the vacating party. It’s a question of motivation, as the satellite operators were compensated for speedy relocation.

To its credit, the DIB 5G study also pans CBRS:

There is precedent for successful spectrum-sharing – in 2010, the FCC opened up the 3550-3700 MHz bandwidth (known as Citizens Broadband Radio Service, or CBRS) to the commercial sector. However, this process took more than five years, a timeframe that is untenable in the current competitive environment. (DIB 5G Study at 11.)

Medin’s alternative to upgrade-and-repack and CBRS is even worse, however. He wants 5G operators to share a single network:

DoD should encourage other government agencies to incentivize industry to adopt a common 5G network for sub-6 deployment. Incentives can include: accelerated depreciation, tax incentives, low interest loans and government purchase of equipment and services. (DIB 5G Study at 28.)

Medin’s justification – improved security – doesn’t follow from experience. If attackers can focus on a single network, their job becomes much easier than attacking a dozen or so. Redundancy is a key element of reliability.

Inventing New Arguments for Old Ideas

Oblivious to the judgments of history, Schmidt still hawks his spectrum policy pink elephants on the pages of the Wall Street Journal and through venues controlled by his Schmidt Futures investment fund. The only change is the replacement of the discredited PCAST claims of sustainability and delay with even more outlandish claims about performance:

AT&T’s and Verizon’s new 5G networks are often significantly slower than the 4G networks they replace…America’s average 5G mobile internet speed is roughly 75 megabits per second, which is abysmal. In China’s urban centers 5G phones get average speeds of 300 megabits per second.

These nonsensical claims aren’t supported by impartial observations. Open Signal says US 5G users see download speeds in the 200 Mbps range. Our coverage is vastly better than China’s; the China card is the last refuge of scoundrels.

[Figure: Open Signal Midband 5G Assessment]

What is Schmidt’s Agenda?

Eric Schmidt and people close to him have been bashing US spectrum policy for ten years. It’s not clear to me why they’re doing this.

Schmidt is not a spectrum engineer, neither schooled nor self-taught. After embracing a whole new architecture for spectrum sharing in the 2012 PCAST report, Schmidt has switched to a “government first” approach that seeks to model the US economy after China.

The PCAST sharing model was never going to work, but Schmidt appears to have backed it with full faith and passion despite a complete lack of evidence. I think we can say the same thing about his current “let’s be like China” model. China is actually lagging the US on all the important dimensions of 5G deployment.

A Reliable Spectrum Pipeline

The only consistency here is the lack of consistency – and a lack of study. Instead of chasing a series of shiny objects the US needs a predictable and reliable practice for transferring spectrum rights among and between old and new applications.

The US needs to create a system that keeps spectrum licenses in circulation, like dollars in the economy. Every technical system that uses spectrum today will be obsolete some day.

The spectrum pipeline acknowledges that fact and leverages it to make spectrum licenses available to the technical systems of tomorrow. I’ll dig deeper into that subject in forthcoming posts.


Jayne Stancavage on the Global Spectrum Pipeline



Jayne Stancavage, the Global Executive Director of Product and Digital Infrastructure Policy at Intel Corporation, fills us in on the global and local systems for spectrum allocation.

She also chairs the Federal Communications Commission’s (FCC) WRC-23 Advisory Committee Informal Working Group for terrestrial services, serves as a member of the Department of State’s International Digital Economy and Telecommunication Advisory Committee, serves on the Board of Directors for the Open RAN Policy Coalition and the US Telecom Training Institute, and is a member of the Global mobile Suppliers Association (GSA) Spectrum Group management team.

We’re very lucky that she had time for an extensive discussion of the use cases, management systems, and policy considerations around spectrum-based systems around the world.

Highlights

  • How we make spectrum available for new use cases and applications: Where bands are currently in use, legacy systems are upgraded to use smaller footprints and excess is either auctioned for licensed use, made available for free for unlicensed use, or made available under a hybrid model with license for priority use with free access at other times. New spectrum is harnessed by new technologies made available by advances in semiconductor (“chips”) engineering. Timecode 2:42.
  • National broadband plans: The US National Broadband Plan of 2010 was the first such plan of the iPhone era. Consequently, it broke new ground in addressing spectrum reassignment. Current discussion in Washington DC suggests the time is ripe for a national spectrum plan to keep the progress going. Timecode 17:18
  • Prospects for the future: It’s incredibly hard to predict what new applications will come to the fore for radio-based systems. The only certainty is more and better use cases and applications. While today’s applications are human-centric, the next generation of wireless systems is likely to transform entire economic sectors. One good example is the combination of wireless, cameras, and AI to improve manufacturing processes and health care. Timecode 34:51

Index

  • Repurposing spectrum from aging legacy applications to higher uses. 2:42
  • The key to higher uses is “upgrade and repack”. 5:57
  • Technology innovation in chips, systems, and software makes new spectrum bands available. 7:47
  • Moore’s Law plays out in communications as well as in computation. 10:13
  • WRC-23 is set to amend and improve the international treaty for satellite and terrestrial radio uses. 12:20
  • Nothing illustrates the importance of international interoperability like travel. 15:34
  • National broadband plans now include important provisions for wireless systems. 17:18
  • Nothing restores confidence in the ability of Congress to work on a bipartisan basis like spectrum policy. 25:06
  • CBRS is a novel system that combines licensed and unlicensed uses into a single framework. It’s complicated. 29:12
  • Prospects for the future of wireless: it’s going to be stunning! 34:51
  • Is Moore’s Law a conspiracy to create demand for spectrum? It’s plenty smart in any case. 42:54

Enjoy!


Spectrum Policy is Too Politicized


Congress is struggling to find a way forward on spectrum licensing policy. The spectrum pipeline is bare, the FCC’s auction authority is on life support, and the national spectrum strategy NTIA and the FCC are supposed to be developing is a long way from completion.

While Congress stands still, markets for wireless-enabled services continue to grow, generating trillions per decade in economic value from licensed networks alone. On the unlicensed side, Bluetooth and Wi-Fi are ubiquitous, even if their economic value is harder to quantify; these services likely contribute hundreds of billions if not trillions as well.

The last FCC was very generous to Wi-Fi: Its 6 GHz order provided 1.2 GHz, tripling the amount of unlicensed spectrum Wi-Fi can use in the US. The regulator has been much more parsimonious with licensed spectrum. In the 5G and 6G C-band sweet spot, the FCC relied on voluntary surrender of 300 MHz by incumbent GEO satellite operators moving into the slot between the C-Band and the radio altimeter band.

What Comes Next?

The incentive auction freeing up the C-Band generated $81 billion all by itself, making it the most expensive (in terms of dollars per MHz) spectrum auction ever. Demand for licensed spectrum is clearly high. Meanwhile, consumers are moving into the 6 GHz Wi-Fi band at a snail’s pace as it provides little noticeable improvement to non-gamers per our testing. Device manufacturers are shifting to 6 GHz chipsets, but consumers are not upgrading routers.

Notwithstanding the high demand for licensed spectrum and the imbalance in recent allocations, some in Congress (chiefly on the Democratic side) would like spectrum auctions to simply stop. This faction responds to demands from incumbents with heavy investments in legacy tech (such as DoD), cable companies, populist policy wags, and visionary technologists who see a future in which radio interference is a solved problem.

There’s a lot to argue about, and all stakeholders have interests, perceived or legitimate, that will have to be resolved for the nation as a whole to progress. It’s important for lawmakers to enter this debate with an interest in learning as well as freedom from emotional baggage that generates bias against particular stakeholders.

How about a Little Sis-Boom-Bah?

As veteran telecom reporter John Hendel tells it, bias in favor of Wi-Fi plays a big part in the current spectrum debate (as if Wi-Fi were in danger):

As a policy debate, the fight to prioritize Wi-Fi and other unlicensed use of the airwaves is central to today’s spectrum arguments in Washington, and their outcome could dictate the next several years of U.S. tech policy. Congress is negotiating a package of spectrum legislation eyeing these issues, still largely in flux but pegged to the reauthorization of FCC spectrum powers set to expire in March.

And this debate is likely to be central to the Biden administration’s promised National Spectrum Strategy, which could come this year, and set goals for how airwaves get into the hands of the private sector.

The feeling that Wi-Fi is the mother of innovation reflects the relentless and very successful public relations campaign executed by Big Tech and its minions. Hendel summarizes the campaign’s key claim:

The whole thing shakes out — broadly — into an argument between the big telecoms that want more spectrum to carry their cell signals, and the consumer-tech and cable companies who want to be sure there’s a wide-open playing field for device innovation.

[my emphasis] So Big Telecom wants the airwaves for their very own cell signals, but scrappy little Wi-Fi carries the entire innovation mantle for all of us. This framing ignores or devalues practically every app on our smartphones.

Clearly, no one technology has a monopoly on innovation. In the real world we have networks inside our homes and offices and outside them. Wi-Fi can’t do what 5/6G can do outdoors even with all the help enthusiastic lawmakers can give it; we need both licensed and unlicensed spectrum to enjoy the full scope of innovation. How can anyone in their right mind think otherwise?

Where Does Internet Anti-Telecom Bias Come From?

The bias against telecom in the Internet space has a long history. The oldest story about the battle between Internet innovation and the phone company is the Internet’s own creation myth.

The myth pits computer people – the so-called “Netheads” – against telecom people, the “Bellheads.” It comes from a demo of Internet precursor ARPANET given by Ethernet inventor Bob Metcalfe to AT&T executives. It didn’t go well:

“I’m sitting at a terminal,” recalls Metcalfe, “this graduate student with a huge red bushy beard, giving a tour of this network to 10 executives from AT&T, all of whom were wearing pinstripe suits…and in the middle of my demo – for the one time in the whole three days – the system crashed. And I looked up. And they…were happy…that it crashed. They were smiling.” Metcalfe is still incredulous. “This was my life’s work. My crusade. And these guys were happy that it didn’t work.”

The takeaway? “I saw that there are people who will connive against innovation,” he says. “They’re hostile to it. And that has shaped my behavior ever since.” Metcalfe’s is a world of good guys and bad guys, and since that day in 1972, noncompetitive telcos have been planted firmly in the latter category.

Practically every time I’ve met an Internet person of Metcalfe’s generation, the conversation veers toward the wickedness of “the powers that be” within seconds. When I read “a wide-open playing field for device innovation,” Metcalfe’s tale always comes to mind.

Good Guys and Bad Guys

Even though the AT&T of 1972 isn’t today’s phone company and the ARPANET of that era predates TCP/IP, the tendency to label one group good and the other bad will always be the lazy person’s route to resolving complex problems.

This naïve framing drove the net neutrality controversy created by law professor Tim Wu in 2002. At a Silicon Flatirons event, Wu argued that broadband providers needed to be heavily regulated lest they prevent Silicon Valley entrepreneurs from achieving their full potential for greatness.

The title of Wu’s paper “Network Neutrality, Broadband Discrimination” tells you who the good guys and bad guys are. Twenty years later we still don’t have a network neutrality law in the US and the broadband plant is the least of the Internet’s worries. The framing is naïve.

The Dualist Narrative in Spectrum Policy

While none of Wu’s dire predictions panned out, he is nevertheless a highly respected figure in Internet policy, publishing several books, securing tenure at Columbia Law, and serving as the White House chief tech policy guru. The politicization of spectrum policy appears to be one of his legacies.

It doesn’t have to be this way. Spectrum licensing enables a range of services that unlicensed can’t provide, such as high performance wide area connectivity, GPS, satellite broadband, and microwave backhaul. And unlicensed spectrum provides a range of services that don’t benefit from licensing, such as Bluetooth headsets and Wi-Fi appliances.

In a rational world, licensed and unlicensed must coexist, as they do when your Wi-Fi session at Starbucks determines its location via GPS or hands off its data to a microwave backhaul link. Getting from either/or to both/and has to be among the key goals of the national spectrum strategy. If either one must be prioritized, we need to look at the objective benefits.

Efficiency is Enhanced by “Licensed First”

In the beginning, the wireless part of the Internet was designed to work around licensed spectrum allocations. The first example was the Bay Area Packet Radio network debuted by Don Nielson’s team in 1976. It sought to avoid interference with licensed systems by using spread spectrum technology.

Wi-Fi originally followed this path, using spread spectrum in a junk spectrum band used by consumer microwave ovens. Wi-Fi was designed to tolerate interference on the reception side and to minimize it on the transmission side by using low power over short distances. Licensed systems generally use higher power in order to cover miles, while Wi-Fi only promises to reach 200-300 feet.

CBRS extends the concept of licensed first by offering licenses to new spectrum when and where they are needed, while allowing unlicensed systems to scavenge unclaimed spectrum for free. Unlicensed simply finds a band (or combination of bands) of spectrum sufficient to get the job done that isn’t being used by a license holder. Where the market demands licensed services, they can also be provided.

This system of priorities promotes efficiency and flexibility. Under the best of circumstances, Wi-Fi is only able to utilize spectrum half the time because the protocol waits for the network to be idle before transmitting. Licensed systems manage spectrum by schedule, and can use 95% of available capacity.
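
To see where figures like these come from, here is a toy slotted model in Python contrasting contention-based access with scheduled access. It is a sketch (closer to slotted ALOHA than to real CSMA/CA) with made-up parameters, not a measurement:

```python
import random

# Contention-based access (idealized listen-before-talk) vs. scheduled
# access. A contention slot is useful only when exactly one station
# transmits; idle slots and collisions are wasted. All parameters are
# illustrative.

def contention_utilization(n_stations: int, p_transmit: float, slots: int = 100_000) -> float:
    useful = 0
    for _ in range(slots):
        transmitters = sum(random.random() < p_transmit for _ in range(n_stations))
        if transmitters == 1:
            useful += 1
    return useful / slots

def scheduled_utilization(control_overhead: float = 0.05) -> float:
    # A scheduler assigns every slot to exactly one station; only
    # control overhead is lost.
    return 1.0 - control_overhead

print(f"contention (10 stations): {contention_utilization(10, 0.1):.0%}")  # roughly 39%
print(f"scheduled:                {scheduled_utilization():.0%}")          # 95%
```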

How Innovation Really Works in Broadband

The history of innovation in the networking space is rife with examples of great Nethead ideas that didn’t come to fruition until Bellheads took care of the deeply technical engineering details. The original standard for Metcalfe’s Ethernet – the Blue Book devised by a committee of computer people from DEC, Intel, and Xerox – called for attaching every computer to a shared coaxial cable and implementing an elaborate protocol (“carrier-sense multiple access with collision detection and randomized binary exponential backoff”) to determine who got to transmit.
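
For the curious, here is a minimal sketch of randomized binary exponential backoff as classically specified for 10 Mbps Ethernet; the constants are the textbook ones, and the code is illustrative rather than a faithful rendering of the Blue Book text:

```python
import random

# After the n-th successive collision, a station waits a random number of
# slot times drawn uniformly from [0, 2^min(n, 10) - 1], giving up after
# 16 attempts. Constants follow classic 10 Mbps Ethernet.

SLOT_TIME_US = 51.2   # one slot time, in microseconds
MAX_ATTEMPTS = 16

def backoff_delay_us(attempt: int) -> float:
    """Delay before the next retransmission attempt, in microseconds."""
    if attempt > MAX_ATTEMPTS:
        raise RuntimeError("excessive collisions: frame dropped")
    window = 2 ** min(attempt, 10)  # the window stops growing at 1024 slots
    return random.randint(0, window - 1) * SLOT_TIME_US

# After three straight collisions a station waits between 0 and 7 slots:
print(backoff_delay_us(3))
```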

The cost of the coaxial cable and the electronics attached to it made potential users wary of installing Blue Book Ethernet. The IEEE 802.3 standards body accepted it anyway after adding its own set of options. When cost became a barrier to adoption, 802.3 chartered a task force to make it cheaper. [I was the vice chair of this task force, known within 802.3 as the StarLAN task force, and later a Wi-Fi contributor.]

The key to lowering the cost of installation was to ditch the shared cable and the delicate protocol in favor of a shared piece of electronics – a hub or switch – with individual telephone-style wires between the hub and the computer. This approach allowed multiple kinds of cabling, multiple speeds, prioritization, and fault isolation. Hub and spoke, as it’s called, is not just cheaper, it’s better and more scalable than a shared cable. Most of the ideas for this new approach to Ethernet came from people working for AT&T divisions or marketing to them.

Something similar happened with Wi-Fi. Some of the people from the StarLAN task force contributed ideas for the organization of the 802.11 network and its access protocols that built on StarLAN work. This is why we have Wi-Fi routers instead of a Blue Book-like network of peers.

Who Do You Trust?

When I say the early trust of Internet-native businesses and distrust of telecoms was naïve, I’m comparing the size and reputation of Big Tech firms today vs. their charmed status when the network neutrality drama started in the early aughts. In those days, Google, Yahoo, Netflix, and Amazon were scrappy little startups and Twitter and Facebook didn’t exist.

The telecom world was mainly highly concentrated wireline telephone and broadband service, and cellular was a novelty. It was rational to fear the large monopolies and to promote the development of the innovators in competitive markets. But the tide has turned dramatically since 2002.

Float the idea of network services provided by Facebook to your friends and enjoy the jokes about privacy. Speculate about terms of use for broadband provided by Amazon and wait for the chuckles. Compare the evolution of cellular across its generations to the decay of Twitter and tell me who’s moving forward and who’s headed for extinction.

The Black Hole of Unlicensed

Once the FCC designates a spectrum band for unlicensed use, there’s no turning back. Unlicensed means anyone can use it for anything at any time with little regard for infrastructure investment, reliability, or efficiency. Flexible use licenses are just the opposite, incentivizing their holders to do more with less.

It’s no wonder that the Wi-Fi industry’s next step after the FCC’s gift of 1200 MHz of spectrum was to ask for even more spectrum in the 7 GHz band. We’re in the early days of 6 GHz Wi-Fi and its success remains to be proved. There are more critical, demonstrated needs for 7 and 12 GHz. While 6 GHz Wi-Fi is utterly uncongested and little used, we do see congestion on residential 5G networks.

T-Mobile essentially pioneered this service in the US, and they’ve suspended subscriptions in some areas because of network overload. If spectrum policy tracked demand, residential 5G would be the priority. And Wi-Fi is a non-starter for residential broadband in normal suburban neighborhoods even if it works great on 15-acre hobby ranches in central Texas.

Set Network Innovation Free

From the standpoint of increased network efficiency and resilience, Wi-Fi is a weak player. For all its cost and complexity, licensing does a better job of matching supply and demand than the model that takes licenses off the table in order to subsidize inefficient networks. Flexible use licenses perform extremely well.

Wi-Fi and licensed 3GPP (5G and 6G) have a symbiotic relationship in which advances in signal processing, modulation, encoding, and spectrum reuse are shared between the two. Chipset producers benefit from learnings from the other side of the licensing divide, the rapid revision of Wi-Fi standards, and the perfection of novel concepts in licensed systems in which devices and infrastructure are well integrated.

Starving one side of this divide to nurture the other can only produce dire results for innovation in network design and technology development. The worst thing the US can do to spectrum policy would be to import political partisanship, grievance, and score-settling into the process of policy development. Congress needs to rise above these divisive tendencies to craft sound spectrum policy.

Spectrum policy also needs to be guided by the realities of network engineering rather than the desires of network incumbents to protect legacy business models, legacy reputations, and legacy technologies from upstarts. Spectrum policy need not be an exercise in exclusion; it must focus on enhancing the vibrant exchange of ideas among all players in the innovation ecosystem.

The post Spectrum Policy is Too Politicized appeared first on High Tech Forum.

Correcting the FCC’s 6 GHz Blunder


A new economic report by Raul Katz for WifiForward appears to inadvertently make a case for switching from Wi-Fi to Ethernet for better performance and a resulting boost to the US economy. The analysis is complicated, requiring us to connect some dots that Katz doesn’t want to connect, but the implications are clear. Let’s start with some background.

The FCC made two major spectrum allocation blunders in recent times: allocating 1,200 MHz to low-power unlicensed use across the 6 GHz band and putting low-power CBRS between two high-power bands. Both errors are under review by the agency.

Cable industry front groups Spectrum for the Future and WifiForward are in high dudgeon over the reviews because they know the decisions were wrong and can’t stand up to scrutiny. There’s nothing inherently wrong with helping Wi-Fi along or with experimenting on novel spectrum management models, of course, but these cases both have significant flaws.

Both 6 GHz Wi-Fi and CBRS rely on spectrum management agents – SAS for CBRS and AFC for 6 GHz, both designed by the Wireless Innovation Forum – that respond to queries from registered, authenticated users with information about available channels and their permitted power levels. This is good, a feature Wi-Fi should have had from the beginning. I have no quarrel with SAS and AFC, but technology only gets us so far with spectrum policy.
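To make the mechanism concrete, here’s a minimal sketch of the query-response pattern an AFC transaction follows. The endpoint URL, field names, and values are illustrative placeholders of my own, not the actual WInnForum/Wi-Fi Alliance schema.

```python
import json
import urllib.request

# Hypothetical AFC-style availability inquiry. The URL and every field
# name below are illustrative placeholders, not the real AFC schema.
AFC_URL = "https://afc.example.net/availableSpectrumInquiry"

inquiry = {
    "deviceId": "SP-DEVICE-0001",  # registered, authenticated device
    "location": {"lat": 39.7392, "lon": -104.9903, "uncertainty_m": 50},
    "bands_mhz": [[5925, 6425], [6425, 7125]],  # lower and upper 6 GHz
}

req = urllib.request.Request(
    AFC_URL,
    data=json.dumps(inquiry).encode(),
    headers={"Content-Type": "application/json"},
)

# The agent answers with the channels the device may use and the maximum
# power permitted on each, given the incumbents it must protect.
with urllib.request.urlopen(req) as resp:
    answer = json.load(resp)
for channel in answer.get("availableChannels", []):
    print(channel["center_mhz"], channel["max_eirp_dbm"])
```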

The Economics Don’t Add Up

The front groups are minimally interested in what’s good for the private 5G networks enabled by CBRS and the performance gains achieved by Wi-Fi 7 users as such. Rather, they appear to regard these initiatives as means to deprive 5G Fixed Wireless Access (FWA) of spectrum that would enable faster rollout of competitive broadband services.

Cable modem service is losing market share to FWA, and that’s not good for cable company share prices. WifiForward’s response to the cable modem crisis includes funding for a pair of quasi-economic studies by Raul Katz’s Telecom Advisory Services: Assessing the economic value of Wi-Fi in the United States (2024) and Economic Loss If the Top 700 Megahertz of the 6 GHz Band is Repurposed for Licensed Use (2025).

Both studies rely on an economic model that posits a precise, linear causal relationship between the download speed and latency of broadband services and GDP per capita [Appendix B of Assessing]; a worked example applying the coefficients follows the list:

• A 10% increase in download speed results in a 0.1956% increase in GDP per capita

• A 10% decrease in latency results in a 0.4503% increase in GDP per capita
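To see what these coefficients imply, here’s a worked sketch that treats them as constant elasticities in a log-log model, which is the conventional reading of such estimates; whether Katz applies them exactly this way is my assumption.

```python
# Katz's coefficients read as constant elasticities (my assumption about
# the functional form; the percentages above are their linearized form).
SPEED_ELASTICITY = 0.01956     # +10% download speed -> +0.1956% GDP/capita
LATENCY_ELASTICITY = -0.04503  # -10% latency        -> +0.4503% GDP/capita

def gdp_multiplier(speed_ratio: float, latency_ratio: float) -> float:
    """GDP-per-capita multiplier the model implies for a change in
    download speed and latency, expressed as new/old ratios."""
    return (speed_ratio ** SPEED_ELASTICITY) * (latency_ratio ** LATENCY_ELASTICITY)

# Example: upgrade a household from 100 Mbps to 1 Gbps, latency unchanged.
m = gdp_multiplier(speed_ratio=10.0, latency_ratio=1.0)
print(f"Implied GDP/capita gain: {(m - 1) * 100:.2f}%")  # ~4.6%
```

Even on the model’s own arithmetic, a tenfold speed increase buys less than a 5% rise in GDP per capita, and that’s before asking whether any application can use the extra speed.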

Katz pulls speed and latency figures out of Speedtest, applies them to Wi-Fi 7, and claims Wi-Fi offload of mobile device traffic automatically gooses GDP regardless of application requirements or any other trivial details. In fact, the biggest consumer of wireless traffic is video streaming, an application whose speed is throttled by the source (Netflix, Amazon, etc.) to conserve server resources.

How Video Streaming Actually Works

Higher resolution means more CPU cycles and storage accesses, both of which cost money, so video streaming services compress their content and cap its delivery rate. Here are YouTube’s recommended broadband speeds per format:

Video Resolution    Recommended sustained speed
4K UHD              20 Mbps
HD 1080p            5 Mbps
HD 720p             2.5 Mbps
SD 480p             1.1 Mbps
SD 360p             0.7 Mbps

Hence, increasing the speed of your broadband service from 50 Mbps to 500 Mbps or 5 Gbps has no impact on your streaming experience or the economy. All that matters is meeting the minimum requirement; even if watching video in ultra-high resolution actually made you more productive, connection capacity above 20 Mbps per device goes to waste on this application.
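A quick back-of-the-envelope sketch using the YouTube figures above makes the point: per-device capacity beyond the top streaming tier buys nothing.

```python
# Simultaneous 4K streams a link can carry at YouTube's recommended
# 20 Mbps per stream (from the table above).
UHD_4K_MBPS = 20

for link_mbps in (50, 500, 5000):
    streams = link_mbps // UHD_4K_MBPS
    print(f"{link_mbps:>5} Mbps link: {streams} simultaneous 4K streams")

# A 50 Mbps link already carries two 4K streams; no household watches
# the 250 streams a 5 Gbps connection could sustain.
```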

Broadband advocates of many stripes have been trying to connect broadband adoption and speed with GDP growth for a very long time, but however plausible the premise feels, the effect remains unmeasurable at the high end of the speed scale. Adoption matters and minimum acceptable speed matters, but there is no incremental benefit at the extreme high end of the bandwidth spectrum except for very unusual applications.

Unclear Causality

The relationship between broadband quality and economic output is certainly non-linear. Something is massively greater than nothing, but world-leading performance doesn’t increase output that wasn’t constrained by poor broadband in the first place.

Moreover, the direction of the causality between income and broadband speed is ambiguous at best. As the World Bank puts it:

Broadband penetration is generally modeled as a function of price and income in the demand equation but is awkwardly linked to the output equation to account for causality. Other models estimate broadband penetration as a function of other factors and use that as the independent variable in an effort to weaken the correlation between income and broadband penetration. Neither is satisfactory and while econometric models are useful tools for estimating relationships, they cannot prove causation. [emphasis added]

But this sentiment doesn’t keep advocates and lobbyists from trying to establish a set of facts favorable to their clients even if their details are squishy.

Reacting to Wi-Fi Slowdown

Setting aside our quibble with the false precision of the Katz GDP estimator, it is a truth universally acknowledged that fast networks are better than slow ones. While Economic Loss If the Top 700 Megahertz of the 6 GHz Band is Repurposed sees nothing but horror coming from a narrower 6 GHz Wi-Fi band, reality tends to be more complex than economic modeling.

Assuming that Wi-Fi will slow down in the future (despite the best efforts of wireless engineers to make it more efficient), are we justified in assuming that the entire broadband experience for consumers and enterprises will automatically suffer? This could be the case for applications that absolutely depend on Wi-Fi, but such applications are not nearly as prevalent as Wi-Fi connections used where they’re not really needed.

Dean Bubley, a favorite analyst of the “shared” spectrum crowd, offered a sober analysis of Wi-Fi’s most data-intensive application in a LinkedIn post:

However, being even-handed, I also need to call out a growing habit in Wi-Fi that needs to change: continually referencing its use for 90% of consumer Internet traffic.

Here, the “yes but” is the huge role of Wi-Fi for connecting large-screen TVs in the home for streaming, which I believe accounts for a large % of that impressive figure. An hour of Netflix or YouTube in the living room can mean 1-4GB of data. Of a typical household’s 500GB per month, a significant chunk is just streamed video.

Often, TVs are positioned very close to the home gateway or WiFi router, and to be honest that could be “offloaded” with an Ethernet cable (remember those?). We rightly talk about offloading cellular traffic to Wi-Fi for efficiency – this is a similar concept.

In fact, all or nearly all Wi-Fi traffic is already offloaded to Ethernet at the access point. It’s dead simple to connect a TV set, AV receiver, or video streaming box to a nearby gateway router with one of the fabulous new light, flexible SlimRun Ethernet cables. Homes and enterprises with disappointing Wi-Fi performance already do this today.

Less Reliance on Wi-Fi Means a Better Economy

The Katz analysis fails to recognize that Wi-Fi is an optional intermediary that is significantly overused today. Wi-Fi traffic can’t reach the Internet until it is offloaded onto the local Ethernet for carriage to the cable modem or optical network terminal that connects the premises to an ISP. [Quibble: a possible exception is the all-in-one Internet gateway that combines a modem, an Ethernet switch, and a Wi-Fi access point in a single box. But even there, the internal circuits route Wi-Fi traffic through the Ethernet switch chip.]

Because Wi-Fi offloads onto Ethernet, Wi-Fi traffic can never move faster than Ethernet. Choosing Wi-Fi over Ethernet is often a matter of convenience rather than necessity. If speed and latency are as important as Katz asserts, we’re better off skipping the intermediary and using Ethernet where we can.
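This is just the bottleneck principle: end-to-end throughput is the minimum of the links along the path, so the Wi-Fi hop can only subtract capacity, never add it. A minimal sketch with assumed round-number link rates:

```python
# End-to-end throughput is capped by the slowest hop on the path.
# All link rates below are assumed round numbers for illustration.
def path_throughput_mbps(hops: dict[str, int]) -> int:
    return min(hops.values())

wifi_path = {
    "device to AP (Wi-Fi 7)": 1800,    # typical laptop rate (assumed)
    "AP to switch (Ethernet)": 5000,
    "switch to ONT (Ethernet)": 5000,
    "ISP service tier": 3000,
}
wired_path = {
    "device to switch (Ethernet)": 5000,
    "switch to ONT (Ethernet)": 5000,
    "ISP service tier": 3000,
}

print(path_throughput_mbps(wifi_path))   # 1800: the Wi-Fi hop is the ceiling
print(path_throughput_mbps(wired_path))  # 3000: the ISP tier becomes the limit
```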

If we take Katz’s claim that high-speed local networks boost GDP at face value, then nothing is more patriotic than an extensive wired network inside every home and office.

Wi-Fi Takes a Big Toll on Quality

For illustrative purposes, I tested a Wi-Fi 7-native Acer Swift Go 16 laptop and an iPhone 15 Pro Max over both Wi-Fi and Ethernet with Speedtest and iPerf3. In general, Ethernet is roughly twice as fast as Wi-Fi, with less latency to boot.

Test        Device    Interface    Download      Upload        Latency
Speedtest   Laptop    Wi-Fi 7      2,562 Mbps    1,643 Mbps    5 ms
Speedtest   Laptop    Ethernet     2,906 Mbps    3,240 Mbps    3 ms
Speedtest   iPhone    Wi-Fi 6E     1,420 Mbps    1,230 Mbps    4 ms
Speedtest   iPhone    Ethernet     3,120 Mbps    2,590 Mbps    3 ms
iPerf3      Laptop    Wi-Fi 7      1,790 Mbps    1,800 Mbps    ND
iPerf3      Laptop    Ethernet     4,530 Mbps    4,530 Mbps    ND
iPerf3      iPhone    Wi-Fi 6E     1,427 Mbps    1,004 Mbps    ND
iPerf3      iPhone    Ethernet     3,483 Mbps    1,647 Mbps    ND

(ND = no data)

The Internet connection under test is 3 Gbps symmetrical FTTH from Quantum Fiber/CenturyLink. [iPerf3 uses a local server, so it doesn’t touch the Internet.] The Ethernet connection uses a plug-and-play UGREEN 5 Gbps USB-C adapter.
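For readers who want to reproduce this kind of test, iPerf3’s -J flag emits JSON that’s easy to harvest. A minimal sketch, assuming an iPerf3 server is already running on a LAN host (the address is a placeholder):

```python
import json
import subprocess

# Ten-second TCP test against a LAN iPerf3 server, reading the sustained
# rates from iPerf3's JSON report (-J). Start the server separately with
# `iperf3 -s`; the address below is a placeholder for your server.
SERVER = "192.168.1.10"

result = subprocess.run(
    ["iperf3", "-c", SERVER, "-t", "10", "-J"],
    capture_output=True, text=True, check=True,
)
report = json.loads(result.stdout)

sent = report["end"]["sum_sent"]["bits_per_second"] / 1e6
received = report["end"]["sum_received"]["bits_per_second"] / 1e6
print(f"Sent {sent:,.0f} Mbps, received {received:,.0f} Mbps")
# The default test runs client-to-server; add -R to measure the
# download direction instead.
```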

Cost and Convenience of Ethernet Wiring

Contrary to Katz’s claim that USB Ethernet adapters require users to “turnoff the Wi-Fi, refresh the Internet page and then access the Internet,” the adapter simply needs to be plugged into the device and connected to the Ethernet. Operating systems are smart enough to choose the wired connection over the wireless one automatically when both are available.

It may be useful to compare these figures with the analysis I did with both desktop and laptop computers across Wi-Fi generations 4 through 7 for a picture of peak Wi-Fi performance with iPerf3. Pure Ethernet performance on desktop machines was ~9.36 Gbps, as fast as the adapters can handle, while Wi-Fi 7 downloads peaked at 3,000 Mbps on desktops and 2,140 Mbps on a laptop with native Wi-Fi 7.

Katz massively overestimates the cost and difficulty of installing Ethernet cable, declaring that the “national average for wiring a 2-room residence with CAT 6 is $660.” That uncited claim must include some hefty labor charges, as contractors generally charge $100–200 per room for wiring entire houses with Ethernet cable, according to the Reddit Home Networking sub.

There’s no Avoiding Ethernet in a Typical Home

Setting up a Wi-Fi network for a home with more than two rooms requires more than one Wi-Fi access point (AP); one per two rooms is a good rule of thumb. Each AP requires Ethernet backhaul for peak performance. Such homes can’t readily forgo Ethernet in any case.

Not only is Ethernet essential for Wi-Fi backhaul and useful for video streaming, it’s the most rational way to connect video cameras to a home network. Cameras need power as well as data transfer, which can be supplied by Power over Ethernet (PoE) with an appropriate switch.

PoE also provides power to the ceiling-mounted Wi-Fi access points in a well-designed home network as well as to satellite Ethernet switches that serve multiple devices room-by-room for as little as $30.
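As a sketch of the planning arithmetic, here’s a power-budget check for a head-end PoE switch feeding the APs, cameras, and satellite switches described above. The device counts and power draws are my illustrative assumptions; the per-port limits come from IEEE 802.3af (15.4 W) and 802.3at/PoE+ (30 W).

```python
import math

# PoE budget sketch for an 8-room home, applying the one-AP-per-two-rooms
# rule of thumb. Device draws are assumed figures for illustration.
ROOMS = 8
APS = math.ceil(ROOMS / 2)

loads_watts = {
    "Wi-Fi AP": (APS, 13.0),        # within an 802.3af port's 15.4 W
    "PoE camera": (4, 6.5),         # assumed draw
    "satellite switch": (2, 10.0),  # PoE-powered mini switch, assumed
}
SWITCH_BUDGET_W = 120               # total PoE budget of the head-end switch

total = sum(count * watts for count, watts in loads_watts.values())
print(f"{APS} APs; total PoE load {total:.0f} W of {SWITCH_BUDGET_W} W budget")
assert total <= SWITCH_BUDGET_W, "choose a switch with a larger PoE budget"
```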

Conclusion

Networks are destined to be hybrids of wired and wireless devices for the foreseeable future. Ethernet is the true workhorse of residential and enterprise networks whether they also use Wi-Fi or not. The day will come when wireless is the connection of choice for most information processing devices, but that doesn’t mean we abandon wire altogether; connecting North America to Europe and Asia wirelessly is always going to be tricky.

Katz and similar advocates prey on the public’s lack of understanding of key technical concepts such as Wi-Fi, Ethernet, Internet Service, and economics. Some Zoomers call Internet Service “Wi-Fi” because they don’t know what Wi-Fi is made of. Katz himself is reluctant to admit that Wi-Fi is nothing without Ethernet tying the elements of Wi-Fi networks together.

The FCC’s historical concerns about Wi-Fi congestion stem from a lack of understanding of the role Wi-Fi should play in home and enterprise networks, all of which are hybrids. If we stick to using Wi-Fi for genuinely mobile devices and continue to improve Wi-Fi radio performance, there will never be a need for Wi-Fi to use the upper 6 GHz. The development arc for unlicensed spectrum points toward higher frequencies, which don’t generate as much co-channel interference as Wi-Fi does today.

Properly understood, Economic Loss If the Top 700 Megahertz of the 6 GHz Band is Repurposed for Licensed Use makes a strong case for increased reliance on wired networks simply because they’re faster, cheaper, more reliable, and more responsive than Wi-Fi. That much of the Katz study for WifiForward is valuable.

The post Correcting the FCC’s 6 GHz Blunder appeared first on High Tech Forum.