An Overview of the Push to Bring Next-Gen Broadband to Municipal Areas in the US

One of the takeaways from a conference on municipal broadband held at South by Southwest is that affordable access to high-speed internet is crucial for economic growth in the 21st century. Yet while the United States has seen steady buildout of broadband infrastructure for the better part of the last three decades, it has lagged considerably behind other countries.

American consumers face higher prices, middling connection speeds, and far-from-ubiquitous distribution of broadband services. According to the FCC, 40% of Americans do not have access to 100 Mbps connection speeds, which is the sort of bandwidth that consumers will increasingly require.

One culprit for the relatively mediocre state of US broadband connectivity is the lack of competition, according to Robert Faris, a researcher affiliated with Harvard’s Berkman Center for Internet and Society. To bolster his claim, Faris cites FCC figures showing that the majority of American consumers have a choice of two internet service providers or fewer, and that a quarter of households have at most one broadband provider.

The primary cause of this lack of competition, Faris notes, is that “unlike European countries and a large majority of OECD countries, the US has abandoned policies that require the sharing of infrastructure with competing broadband providers. Instead, the US has taken a deregulatory approach that requires competitors to build their own infrastructure in order to enter the market.”

Because broadband providers are required to build their own infrastructure, it is an extremely costly proposition for a telco to build out fiber networks to neighborhoods or purchase dark fiber to handle the increased throughput requirements of faster connectivity. Providers also have little incentive to upgrade systems in far-flung or less densely populated communities with poor broadband connectivity.

The Push for Municipal Broadband Networks

In this context, municipal broadband, in which communities opt to build their own broadband networks, has become a contentious and significant issue. Hundreds of cities, including Rockport, Maine; Chanute, Kansas; and Powell, Wyoming, have had such projects underway since the FCC voted in February 2015 to preempt state laws in North Carolina and Tennessee that prevented municipal broadband providers from expanding outside their territories. The Obama administration has come out firmly in favor of the FCC’s ruling, arguing that it addresses the infrastructural needs of communities that have not been adequately served by the market.

The cities that have opted to deploy their own municipal broadband face many significant challenges, including a very steep learning curve, high risks, competition from better-funded and more experienced incumbent providers, and extremely high capital costs. The Mayor of Seattle recently reiterated to residents that it would cost between $480 million and $664 million to build out a municipal broadband network across the city, a figure far too high to be feasible.
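To put Seattle's estimate in perspective, here is a rough back-of-envelope calculation of capital cost per household. The household count below is an assumption for illustration only (roughly 320,000 Seattle households; the article does not give this figure):

```python
# Back-of-envelope cost per household for Seattle's proposed municipal
# fiber network, using the article's $480M-$664M estimate.
LOW_ESTIMATE = 480_000_000   # USD, low end of the city's estimate
HIGH_ESTIMATE = 664_000_000  # USD, high end of the city's estimate
HOUSEHOLDS = 320_000         # assumed Seattle household count (not from the article)

def cost_per_household(total_cost: float, households: int) -> float:
    """Capital cost spread evenly across all households."""
    return total_cost / households

low = cost_per_household(LOW_ESTIMATE, HOUSEHOLDS)
high = cost_per_household(HIGH_ESTIMATE, HOUSEHOLDS)
print(f"${low:,.0f} to ${high:,.0f} per household")  # roughly $1,500 to $2,075
```

Even under these assumptions, the capital cost alone lands at well over a thousand dollars per household before any operating expenses, which helps explain why the price tag gives the city pause.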

Financing such a system would result in one of the largest tax increases in the city's history, though certain interest groups continue to point out that 15% of Seattle homes do not have access to the internet and that a municipal network could help bridge the digital divide. The results of municipal projects have been mixed in other cities as well. The Burlington Telecom project in Vermont and the municipal network in Provo, Utah, for instance, both ran into fiscal problems and were sold off to private buyers.

That isn’t to say that the results have been dismal across the board. A case study conducted by the Berkman Center investigated the success of Holyoke Gas & Electric Department, a municipally owned electric utility, in providing internet access and competing with Comcast and Charter. It has enjoyed steady revenue growth, saves the city approximately $300,000 a year, and is forging agreements to extend its services to neighboring cities. While Holyoke Gas & Electric has primarily focused on providing businesses and larger institutions with telecom services, it has also begun considering residential fiber-to-the-home (FTTH) offerings.

Google Fiber Buildouts in Metropolitan Areas

Another option for communities facing inadequate internet service is to appeal to Google Fiber, which partners with select cities to provide gigabit internet by building out fiber-optic networks mostly from scratch. It has so far expanded into 10 US cities and has plans in place to expand into several others, including Atlanta, Austin, Charlotte, Nashville, and San Antonio.

An important piece of the Google Fiber network architecture is the fiber hut, which shelters the communications equipment and is deployed throughout the city to facilitate service to customers. Given the zoning and municipal requirements of deploying and facilitating such a network architecture, among other considerations enumerated in Google Fiber’s application checklist, not every city’s application to receive Google Fiber has been approved. Most notably, Seattle’s application was rejected due to its restrictive regulations and permit system, a lack of available utility poles to attach fiber to, and the difficulty of getting the city council to ratify a contract.

In other cases, Google Fiber’s expansion and service offerings have been limited and do not fully meet the needs of residents. In San Francisco, Google Fiber announced plans to use existing fiber to connect certain apartments and condos in order to bring faster internet service to residents, and to connect some public and affordable housing properties for free. However, these plans cover only a few targeted apartment communities.

Given that only 2.6% of users in San Francisco have access to 1 Gbps speeds or higher, the city of San Francisco commissioned a study to estimate the costs of building its own municipal network and found the figure to fall somewhere in the range of $393.7 million to $867.3 million. The report also laid out three possible options for deploying such a network:

  1. A completely public model in which the City would manage the construction of a fiber network and also oversee the administrative and retail operations associated with acting as an Internet Service Provider.
  2. A private sector model in which the City would rely completely on the private sector to acquire gigabit-speed internet access, though it could incentivize private sector companies to provide such service through measures including “relaxing construction regulations and permitting requirements pertaining to network construction, making City property more easily available to ISPs for their network facilities and equipment, and allowing ISPs to use existing public conduit.”
  3. A public-private partnership model in which the city would partner with one or more private sector entities in order to share the costs, operational burdens, and financial risks associated with constructing and operating a gigabit speed network.

San Francisco has yet to decide on a course of action, but the premise of the plan, which is to deploy a ubiquitous fiber-to-the-premises (FTTP) network, is ambitious and unprecedented, in that no city of its size has ever done so before.

Copyright secured by Digiprove © 2016

3 thoughts on “An Overview of the Push to Bring Next-Gen Broadband to Municipal Areas in the US”

  1. Absolutely amazing that people don’t understand basic network effect principles.

    Value gravitates to the core and top of network stacks (or ecosystems of networks, such as the internet). Value tends to grow geometrically, but can grow exponentially when other systems are mutually leveraged (wireless on top of wired, commerce on top of messaging, etc.).

    Cost is concentrated at the bottom and edge of network stacks.

    Without settlements there’s no way for value to be conveyed efficiently from core and top to bottom and edge. This is absolutely necessary for sustainable ecosystem development.

    Furthermore, without shared or open access at the edge (thank goodness for 802.11/wifi/part 15 which Steve Jobs used to resurrect equal access in 2007) the networks bog down into silos at the core and edge.

    Trying to solve a core or macro problem with small, relatively insignificant efforts (which I do not mean to denigrate) at the edges will not be effective and will only add to the cost of service somewhere.

    • Thanks, I appreciate the feedback. I see what you mean when you say settlements address the core problem whereas municipal broadband is a band-aid for a gunshot wound whose record so far has been mixed at best. The observations you make on your blog re: Masayoshi/Sprint’s vision for competitive wireless in America were also thought-provoking and acute. It’ll definitely be interesting to continue to follow those developments and see if they come to fruition.

      • Settlements address the “edge” problem by conveying value from core to edge (managed VPNs that purchase edge access), or by the core driving edge demand uniformly. Note the latter word. There is no uniform and equal edge “lift” with the IP stack. We are living in a world of IPv4 after 25 years, almost all apps still run under 5 Mbps, and the average “session” bandwidth is still measured in kbps.

        It’s the value to the Chicago trader of what the crops are doing in Iowa that will pay for access in rural markets. Likewise, mobile is a great equalizer, but not with flat-rate, all-you-can-eat pricing, because marginal demand can’t be monetized effectively unless the transient user holds the service provider to a uniform standard. Coverage is why Verizon has kicked Sprint’s butt: it can justify higher prices and attract higher-value (highly transient) users. That goes for businesses as well.

        Funny story about Sprint. Only after they spent their billions did the Softbank folks jet in from Japan and realize the big coverage problems Sprint had with higher frequencies and backhaul distance in the US. They had no comprehension of our teledensity and geographic spread. Literally a 10-15% differential in margin.

        Shame, ’cause I brought them 8x8 MIMO P2MP (fixed) solutions in the early-to-mid 2000s for their Clearwire spectrum (which I also brought them in 1997) that would have carved out 50-60 MHz for efficient backhaul to dense 1.9/2.1 GHz cells. For all of Son’s vision, there simply hasn’t been the management execution to make it happen. At this point, I doubt it will.
