
Missing elevation data in Berkeley #162

Closed
graue opened this issue Oct 8, 2024 · 4 comments
graue commented Oct 8, 2024

There's some missing elevation data in Berkeley in production (and staging). It's all just zero.

Example route

[Screenshot: example route showing zero elevation throughout]

@rsarathy rsarathy self-assigned this Oct 8, 2024
@rsarathy

Revisiting this now. I think that I've identified the root cause.

Here's the grayscale thumbnail for the elevation tile covering Berkeley that we're currently using:
[Image: grayscale thumbnail of the current elevation tile; data covers only part of the footprint]

The national elevation dataset is sourced such that, while each TIFF covers a 0.25x0.25 degree graticule, the actual elevation data within the TIFF may only cover a county, city, or other specific region. For this reason, USGS sometimes has multiple TIFFs available for the same quarter-degree graticule.
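To make the quarter-degree graticule arithmetic concrete, here is a minimal sketch (my own illustration, not project code) mapping a lat/lon pair to a NED 1/9 arc-second tile name. The `ned19_n..x.._w..x..` naming convention is inferred from the tile names that appear in this thread; the real USGS catalog should be treated as authoritative.

```python
import math

def ned19_tile_name(lat, lon):
    """Return the ned19_* quarter-degree tile name covering (lat, lon).

    NED 1/9 arc-second tiles are indexed by their NORTH and WEST edges,
    each a multiple of 0.25 degrees (e.g. ned19_n38x00_w122x50 spans
    37.75..38.00 N, 122.50..122.25 W).
    """
    north = math.ceil(lat * 4) / 4   # north edge of the graticule
    west = math.ceil(-lon * 4) / 4   # west edge, as positive degrees W

    def fmt(prefix, deg):
        whole = int(deg)
        frac = int(round((deg - whole) * 100))  # 0.25 -> 25, 0.50 -> 50
        return f"{prefix}{whole}x{frac:02d}"

    return f"ned19_{fmt('n', north)}_{fmt('w', west)}"

# Berkeley (~37.87 N, 122.27 W) falls in the tile discussed here:
print(ned19_tile_name(37.87, -122.27))  # -> ned19_n38x00_w122x50
```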

In this specific instance, there is a different NED grayscale TIFF that should give us the Berkeley elevation data. Its thumbnail looks like this:
[Image: grayscale thumbnail of the replacement NED TIFF, with data covering the full footprint]

The good news is that because this TIFF falls within our usgs ElevationProvider bounding box, we won't need to push any code changes to resolve this issue - just a new data artifact.
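The "falls within our bounding box" check amounts to simple interval containment. A sketch, where the provider bounding box coordinates are hypothetical placeholders (the footprint of ned19_n38x00_w122x50 follows from its name):

```python
def bbox_contains(outer, inner):
    """True if bounding box `inner` lies entirely inside `outer`.

    Boxes are (west, south, east, north) in degrees, with western
    longitudes negative.
    """
    w1, s1, e1, n1 = outer
    w2, s2, e2, n2 = inner
    return w1 <= w2 and s1 <= s2 and e2 <= e1 and n2 <= n1

# Hypothetical provider bounding box for a Bay Area cutout:
provider_bbox = (-123.0, 37.0, -121.5, 38.5)
# Footprint of the replacement tile, ned19_n38x00_w122x50:
tile_bbox = (-122.50, 37.75, -122.25, 38.00)

print(bbox_contains(provider_bbox, tile_bbox))  # -> True
```

Because the replacement TIFF shares the footprint of the tile it replaces, the containment result is unchanged, which is why only the data artifact needs to be swapped.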

Testing

  • Check locally to ensure that we're seeing nonzero data values in Berkeley using USGSProvider.main()
  • Upload the new compressed TIFF to the staging min.io S3 bucket and trigger a graph-cache rebuild
  • Verify that staging has nonzero elevation data in Berkeley and in areas outside San Francisco
  • Upload compressed TIFF to production min.io S3 bucket and trigger a one-off graph-cache rebuild

@rsarathy

I pushed a new copy of the ned19_n38x00_w122x50 compressed TIFF to the staging S3 bucket. It was picked up in the most recent graph-cache rebuild and the elevation data looks correct.

I've just done the same for prod, and the new elevation data should be visible in the next prod graph-cache rebuild.

@graue

graue commented Dec 20, 2024

Thank you for fixing this!

while each TIFF covers a 0.25x0.25 degree graticule, the actual elevation data within the TIFF may only cover a county, city, or a specific region.

Are there any other TIFFs we might be using that have this problem? How can we make sure there are not?

@rsarathy

Are there any other TIFFs we might be using that have this problem? How can we make sure there are not?

I will manually double-check the TIFFs that constitute our 3x3 square of NED 1/9 AS elevation tiles to make sure that they cover the full area of their respective footprints.
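That manual thumbnail inspection could be approximated programmatically by measuring the fraction of nodata (here, zero) pixels in each tile's band: a tile whose data covers only part of its footprint, like the original Berkeley tile, will show a large zero fraction. A sketch, again with a synthetic grid in place of a decoded TIFF and a made-up 10% threshold:

```python
def zero_fraction(band, nodata=0.0):
    """Fraction of cells in a 2D grid equal to the nodata value."""
    total = sum(len(row) for row in band)
    zeros = sum(1 for row in band for v in row if v == nodata)
    return zeros / total

# A tile whose data covers only half of its quarter-degree footprint:
partial = [
    [0.0, 0.0, 31.0, 35.5],
    [0.0, 0.0, 28.4, 30.1],
]
print(zero_fraction(partial))  # -> 0.5

# Flag tiles that may need a replacement artifact (threshold is arbitrary):
needs_review = zero_fraction(partial) > 0.10
print(needs_review)  # -> True
```

Running this over all nine tiles of the 3x3 square would turn the tedious per-thumbnail check into a single pass.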

When I was implementing changes to support the ask in #129, I was thinking about how much of our Bay Area OSM cutout we would be able to cover with the 1/9 arc-second data. It was tedious to download each individual tile and ensure that its footprint did in fact cover the full area of the quarter-degree square (I would check the associated thumbnails). This wore on me a bit, and I ended up settling on 9 tiles that form a 3x3 square over the densest areas of San Francisco.

We still need to cover the rest of our Bay Area cutout with the 1/9 AS DEM, but doing so will take more time and effort on my part, and I will have to be much more cautious about downloading and converting the correct files for our needs.
