FCC Efforts

Broadband Mapping May Take FCC Months Longer

As the FCC works on new broadband availability maps, experts said in recent interviews that building a nationwide map requires a focus on answering specific questions about connectivity. Some said it may be months before a final map is released.

The broadband data task force has moved at a “really quick speed,” task force Senior Counsel Sean Spivey told a precision agriculture task force meeting Thursday. He said the mapping task force is focused on building systems for better data collection, gathering more granular data from ISPs, and creating processes to verify and challenge the data. Collection won't rely on the existing Form 477 filing platform, he said; a new system will be built “from the ground up.” The consumer portal for self-reporting has received more than 12,000 submissions.

An FCC spokesperson said the contracting process is “proceeding at a very fast pace.” The agency couldn't begin solicitations until January and is now working with a data architect to design the data flow and other systems needed to manage the data, the spokesperson said. The FCC adopted a recommendation from NCTA and others to require providers to submit availability data as geospatial shapefiles. The FCC received contract proposals by July 1 to build a location fabric database (the fabric, like shapefiles, describes how the maps capture location information) and is “exploring ways to make improvements to the broadband deployment data the FCC currently collects,” they said: “Our goal is to create … a publicly accessible, data-based nationwide map of locations where broadband is truly available throughout the United States.”

Spivey said providers will be required to submit data either as polygons or as location lists that can be geocoded to the FCC broadband serviceable location fabric, a base-layer map identifying precisely where broadband service is available. “Basically, everyone is singing from the same songbook,” Spivey said, and the maps will be an “apples-to-apples comparison.” Providers will get six months' notice before they must submit new data, Spivey said. The FCC is “trying to move as fast as we can in the procurement for the fabric and the build of the new system.” Mapping task force Chair Jean Kiddoo previously said the maps may not be ready until at least 2022 (see 2102170052).
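To illustrate the fabric concept in general terms: each provider's coverage polygons would be tested against the same base layer of locations, which is what makes claims directly comparable. Below is a minimal, hypothetical sketch using the open-source Shapely library; the polygon coordinates and location IDs are invented, and the FCC's actual system is still being procured.

```python
# Hypothetical sketch: checking a provider coverage polygon against fabric
# locations. All data and names here are illustrative, not the FCC's systems.
from shapely.geometry import Point, Polygon

# A provider-reported coverage area (e.g., derived from a shapefile polygon).
coverage = Polygon([(-90.20, 32.30), (-90.10, 32.30),
                    (-90.10, 32.40), (-90.20, 32.40)])

# Fabric entries: serviceable locations with precise coordinates.
fabric = [
    {"location_id": "A001", "lon": -90.15, "lat": 32.35},
    {"location_id": "A002", "lon": -90.05, "lat": 32.35},  # outside the polygon
]

# Every provider's polygons are tested against the same base layer of
# locations, so coverage claims become directly comparable.
for loc in fabric:
    served = coverage.contains(Point(loc["lon"], loc["lat"]))
    print(loc["location_id"], "served" if served else "unserved")
```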

It’s “probable” that a new map may not be ready until at least mid-2022, said ACA Connects Senior Vice President-Government Affairs Ross Lieberman. The key indicator that the process is moving forward will be when the Office of Economics and Analytics releases a public notice telling providers to submit data within six months, Lieberman said.

Accurate broadband mapping is “critical to the type of smart subsidy programs that will be needed to extend broadband to all Americans,” said an NCTA spokesperson: “We know that the FCC is working diligently to implement the requirements of the Broadband Data Act and cable operators fully support those efforts.”

ISPs will likely build the polygon data they send the FCC from parcels or building footprints, said Theorem Geo Director-Geospatial Technologies Will Reckling. The commission may be able to get granular data identifying connectivity for multi-unit buildings, Reckling said, but it would need to come from providers that have already compiled it: “That effort would be so huge to do from scratch.”

The data needs to be accessible through whatever server houses it, Reckling said, commonly an ArcGIS server or Azure portal. The next issue is visualizing the map in a user-friendly way, he said. The data also must be standardized because providers won't all list raw data in the same format, Reckling said: “It might be download speed as the final database column name, but it's like DL speed in somebody else’s table or it has to be calculated.”
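A rough sketch of the standardization step Reckling describes: two providers report the same field under different column names, and a small normalization layer maps both to one schema. The column names and unit conversion below are hypothetical examples, not any provider's actual format.

```python
# Hypothetical normalization layer for heterogeneous provider tables.
import pandas as pd

# Known aliases for the same field, mapped to one standard column name.
COLUMN_ALIASES = {
    "download_speed": "download_mbps",
    "dl_speed": "download_mbps",
    "down_mbps": "download_mbps",
}

def standardize(df: pd.DataFrame) -> pd.DataFrame:
    # Lowercase incoming headers, then map known aliases to the schema.
    df = df.rename(columns=str.lower).rename(columns=COLUMN_ALIASES)
    # Some fields "have to be calculated": derive Mbps if reported in kbps.
    if "download_mbps" not in df and "download_kbps" in df:
        df["download_mbps"] = df["download_kbps"] / 1000
    return df

provider_a = pd.DataFrame({"Download_Speed": [100, 250]})
provider_b = pd.DataFrame({"DL_Speed": [50, 25]})
print(standardize(provider_a).columns.tolist())  # ['download_mbps']
print(standardize(provider_b).columns.tolist())  # ['download_mbps']
```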

The FCC is expected to map every broadband-serviceable structure in the U.S., and measuring connectivity at that level is “going to have a huge amount of error,” said Technology Policy Institute President Scott Wallsten. Greater granularity builds more error into the dataset, Wallsten said: measuring broadband with sampling methods may produce a more accurate picture. Data collected through the Rural Digital Opportunity Fund Phase I auction challenge process to identify areas that are or will be covered has provided a “very sophisticated set of data,” he said, noting TPI is incorporating it into its own maps.
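As a rough illustration of Wallsten's sampling point: a well-designed sample yields a coverage estimate with a quantifiable margin of error, while per-location mistakes in a full enumeration accumulate with no comparable bound. The figures below are invented for illustration.

```python
# Illustrative only: coverage estimated from a sample, with a margin of error.
import math

sample_size = 5000        # locations actually measured (hypothetical)
served_in_sample = 4100   # of those, how many had broadband (hypothetical)

p_hat = served_in_sample / sample_size
# Standard error of a sample proportion, and a ~95% confidence interval.
se = math.sqrt(p_hat * (1 - p_hat) / sample_size)
low, high = p_hat - 1.96 * se, p_hat + 1.96 * se
print(f"estimated coverage: {p_hat:.1%} (95% CI: {low:.1%} to {high:.1%})")
```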

Granularity and accuracy must both be addressed, said NTCA Senior Vice President-Industry Affairs Mike Romano: “They’re not the same thing.” Having more granular data is “helpful, but not sufficient,” he said. To produce new maps, the FCC needs a vendor to develop a baseline fabric showing all serviceable structures, Romano said: The vendor will build the fabric, and providers will then report their data against it. Romano said that could take at least several months.

Senate Commerce Committee ranking member Roger Wicker, R-Miss., asked NTIA Friday “to reassess its data collection processes and sources, and use only the most up-to-date and accurate data.” NTIA’s new map, released in June (see 2106170053), “is a novel approach to the challenge,” but “I am concerned that this map is as inaccurate as previous federal maps,” Wicker wrote acting Administrator Evelyn Remaley. It “suffers from several major flaws,” including use of “outdated” data from the Census Bureau’s 2019 American Community Survey. The map “relies on the FCC’s census-block level availability data,” which “vastly overstates broadband coverage” and is in the process of being replaced by more granular information, said the letter: It “uses speed-test and usage data that can be affected by a number of variables,” including user equipment.

The NTIA map doesn’t provide insight into whether federal broadband programs like the emergency broadband benefit program are reaching areas where low-income households live, Wallsten said. Some data in NTIA’s map may be proprietary, he said, and it's a problem when a public institution holds data that isn't public.