cache/allSitesYears.csv
This file contains the year-specific information about various sites. It was built externally (I think by Lindsay or David) and hosted on sciencebase because the act of building it took a very long time (it has to pull a ton of NWIS data from the services). I think we need to rebuild this file completely, or at least replace 2016-2019; likely the whole thing, since earlier data in NWIS may also have changed. The file looks like this:
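Presumably one row per site per active year, with the columns used downstream (site_no, huc, dec_lat_va, dec_long_va, plus the year). If we do rebuild it, a minimal sketch using the dataRetrieval package might look like the following; the function name, the column mapping, and the per-state loop are my assumptions, not the original build code:

```r
library(dataRetrieval)
library(dplyr)
library(tidyr)

rebuild_all_sites_years <- function(states = state.abb) {
  per_state <- lapply(states, function(st) {
    # inventory of daily-value discharge (parameter 00060) records in one state
    inv <- whatNWISdata(stateCd = st, parameterCd = "00060", service = "dv")
    select(inv, site_no, huc = huc_cd, dec_lat_va, dec_long_va,
           begin_date, end_date)
  })
  bind_rows(per_state) %>%
    rowwise() %>%
    # expand each site's period of record into one row per active year
    mutate(year = list(seq(as.integer(format(begin_date, "%Y")),
                           as.integer(format(end_date, "%Y"))))) %>%
    ungroup() %>%
    unnest(year) %>%
    select(site_no, huc, dec_lat_va, dec_long_va, year)
}

# write.csv(rebuild_all_sites_years(), 'cache/allSitesYears.csv', row.names = FALSE)
```

Looping one state at a time keeps each NWIS service call small enough to succeed, which is presumably why the original build was slow.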
cache/disch-sites.rds
This file contains only flattened (no time dimension) information about each site used in the map.
```
readRDS('cache/disch-sites.rds')
# A tibble: 22,274 x 4
  site_no  huc   dec_lat_va dec_long_va
  <chr>    <chr>      <dbl>       <dbl>
1 01010000 01          46.7       -69.7
2 01010070 01          46.9       -69.8
3 01010500 01          47.1       -69.1
4 01011000 01          47.1       -69.1
5 01012500 01          47.2       -68.6
6 01012515 01          46.7       -68.8
7 01012520 01          46.7       -68.8
8 01012525 01          46.7       -68.8
```
It depends on cache/allSitesYears.csv, and is a filtered and summarized version of that file.
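The filter/summarize step is presumably something like the sketch below; the exact filtering rules in the real build function are not reproduced here, and the dropped-NA rule is my assumption:

```r
library(dplyr)

all_sites_years <- read.csv('cache/allSitesYears.csv',
                            colClasses = c(site_no = "character"))

disch_sites <- all_sites_years %>%
  filter(!is.na(dec_lat_va), !is.na(dec_long_va)) %>%  # drop unmappable sites (assumed rule)
  distinct(site_no, huc, dec_lat_va, dec_long_va)      # collapse away the time dimension

saveRDS(as_tibble(disch_sites), 'cache/disch-sites.rds')
```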
cache/site-map.rds
This file is a spatial object in R (an sp SpatialPointsDataFrame) which contains dot/site locations in "map space". The object has spatial information and then only one attribute, the site_no. If we add new sites with the data update, which I expect we will, this object will need to be rebuilt. It relies on cache/disch-sites.rds and can be built with this function.
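A minimal sketch of that rebuild, assuming WGS84 input coordinates and a US National Atlas equal-area projection for "map space" (the real build function may use a different CRS):

```r
library(sp)
library(rgdal)  # provides the spTransform methods for sp objects

disch_sites <- as.data.frame(readRDS('cache/disch-sites.rds'))

pts <- SpatialPointsDataFrame(
  coords = disch_sites[, c('dec_long_va', 'dec_lat_va')],
  data = disch_sites[, 'site_no', drop = FALSE],   # keep only the one attribute
  proj4string = CRS('+proj=longlat +datum=WGS84')
)

# project into "map space" (assumed here: US National Atlas Equal Area)
site_map <- spTransform(pts, CRS('+init=epsg:2163'))
saveRDS(site_map, 'cache/site-map.rds')
```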
target/data/year-data.json
This is the main time-information dataset, but it is built in a compressed and almost unreadable way in order to make it much smaller. This JSON has the job of telling the map which dots to draw for a given year, but doing so directly would make the JSON file huge (we have over one hundred years of site locations). Instead, it is a "diff" of losses and gains of sites, specified by gn for gain and ls for loss.
Sites are also grouped into a number of groups (currently 23), each with 1,000 or fewer sites. This file is built at cache/year-data.json and then moved to target/data/year-data.json in the publish phase. It depends on cache/site-map.rds and cache/allSitesYears.csv. You can build this file using this function.
If successful, the build will add sites to the JSON fields for the more recent years, e.g., "2018":{"gn":[],"ls":[454]},"2019":{"gn":[],"ls":[]}
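Reconstructing that diff from per-year site lists is mechanical; here is a minimal sketch (the year_diffs helper and the list-of-index-vectors input shape are my assumptions, and the real function also has to handle the grouping into the 23 groups):

```r
library(jsonlite)

# site_ids_by_year: named list, one integer vector of site indices per year
year_diffs <- function(site_ids_by_year) {
  years <- sort(names(site_ids_by_year))
  out <- list()
  prev <- integer(0)
  for (yr in years) {
    cur <- site_ids_by_year[[yr]]
    out[[yr]] <- list(gn = setdiff(cur, prev),  # sites appearing this year
                      ls = setdiff(prev, cur))  # sites dropping out this year
    prev <- cur
  }
  out
}

# Example: site index 454 active in 2017 only
diffs <- year_diffs(list(`2017` = 454L, `2018` = integer(0), `2019` = integer(0)))
toJSON(diffs, auto_unbox = FALSE)
#> {"2017":{"gn":[454],"ls":[]},"2018":{"gn":[],"ls":[454]},"2019":{"gn":[],"ls":[]}}
```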
cache/bar-data.xml
This XML file is injected into the DOM (as far as I know...) to be the "bars" at the bottom of the map. It relies on cache/state-map.rds (just to get the size of the SVG, it seems) and cache/allSitesYears.csv so that it can get the year totals. Note to team: I noticed that different site filtering is done here vs. what is done in site-map and disch-sites to remove sites; this one simply sums them, so I think the totals might include a few sites that aren't shown in the map. Flag this for later.
This seemed to build fine with the updated time range using this function.
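As a quick check on those totals, assuming allSitesYears.csv has one row per site per year (the same assumption as above), something like this reproduces the per-year counts the bars are drawn from, without the extra filtering applied in disch-sites/site-map (the discrepancy flagged above):

```r
all_sites_years <- read.csv('cache/allSitesYears.csv',
                            colClasses = c(site_no = "character"))
year_totals <- table(all_sites_years$year)  # count of sites active in each year
tail(year_totals)  # sanity-check the new 2016-2019 counts before injecting into the XML
```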
figures/states_map.svg
This is the big goofy/funky one, and it is the largest departure from how we currently build maps. It is a large SVG that includes all of the point locations, the state polygons, the bar chart, and the USGS watermark. This work is pre-D3 and pre-Mapbox, so it builds an SVG in R using the mapping plot functions with an SVG export. It then "cleans up" the SVG by moving things around, adding attributes, and getting rid of some elements. I don't think it is worth understanding everything that function does, as it is a dead-end path we were on years ago before we changed how we do things. Because this file combines all of the other elements, it relies on cache/state-map.rds, cache/site-map.rds, cache/watermark.rds, and cache/bar-data.xml. Of those, we haven't covered the state-map or the watermark, but both should remain unchanged from previous builds, and I figured you could use the old files without worry.
This did manage to build for me with the updated time range (I didn't update the actual data, but the original CSV actually has some 2016, 2017, and 2018 sites in it). I made some changes to this function to make that happen.
If successful, that function will create an unstylized SVG that has extra bars for the additional years added.
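For reference, the general pattern that function follows is roughly the one below: plot the sp objects into an svg() graphics device, then re-open the result as XML for the cleanup step. This is only a sketch of the pattern, not the real build function; the intermediate file name and the plotting parameters are assumptions.

```r
library(sp)
library(xml2)

state_map <- readRDS('cache/state-map.rds')
site_map  <- readRDS('cache/site-map.rds')

svg('figures/states_map_raw.svg')  # hypothetical intermediate file
plot(state_map)                              # state polygons
plot(site_map, add = TRUE, pch = 20, cex = 0.2)  # site dots on top
dev.off()

# the "clean up" step: re-open the SVG as XML and adjust it
svg_xml <- read_xml('figures/states_map_raw.svg')
# ... move nodes around, add attributes, strip unneeded elements ...
write_xml(svg_xml, 'figures/states_map.svg')
```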