After reading in USGS (or USACE) gage observations from TimeSlice files, we (resample and) interpolate through gaps no larger than a threshold size. This all happens in `nhd_io.get_obs_from_timeslices`. The interpolation is done with the Pandas `interpolate` function, which conducts a 1d interpolation of each column in a DataFrame. For a CONUS-sized set of TimeSlices, there are >7,500 columns in the DataFrame that need to be interpolated. On a single CPU, this is proving to be a performance bottleneck, and we need to make it faster, somehow.
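For reference, `DataFrame.interpolate` works column-by-column; here is a minimal, self-contained illustration of the `limit` / `limit_direction='both'` behavior (the gage names and values are made up, mirroring the transposed observation DataFrame with time on the index):

```python
import numpy as np
import pandas as pd

# Two hypothetical gage columns with gaps of different lengths; time on the index.
idx = pd.date_range("2021-08-23", periods=6, freq="min")
obs = pd.DataFrame(
    {
        "gage_a": [1.0, np.nan, 3.0, 4.0, 5.0, 6.0],        # 1-step gap
        "gage_b": [1.0, np.nan, np.nan, np.nan, 5.0, 6.0],  # 3-step gap
    },
    index=idx,
)

# limit=1 fills at most one consecutive NaN from each end of a gap
# (limit_direction='both'), so the middle of the 3-step gap stays NaN.
filled = obs.interpolate(limit=1, limit_direction="both")
```

This is why `limit=interpolation_limit` in the code below acts as the gap-size threshold: runs of NaNs longer than the limit are only filled at their edges.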
```python
# ---- Interpolate USGS observations to the input frequency (frequency_secs)
observation_df_T = observation_df.transpose()  # transpose, making time the index
observation_df_T.index = pd.to_datetime(
    observation_df_T.index, format="%Y-%m-%d_%H:%M:%S"  # index variable as type datetime
)

# specify resampling frequency
frequency = str(int(frequency_secs / 60)) + "min"

# interpolate and resample frequency
observation_df_T = (
    observation_df_T.resample("min")
    .interpolate(limit=interpolation_limit, limit_direction="both")
    .resample(frequency)
    .asfreq()
)

# re-transpose, making link the index
observation_df_new = observation_df_T.transpose()
```
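Since each column is interpolated independently, one way to attack the bottleneck is to split the DataFrame column-wise and run the same resample/interpolate pipeline in worker processes. This is a sketch only: `parallel_interpolate`, `_interpolate_chunk`, and the chunking scheme are illustrative, not existing t-route code, and the speedup would need to be measured against the pickling overhead of shipping column chunks to workers.

```python
import numpy as np
import pandas as pd
from concurrent.futures import ProcessPoolExecutor

def _interpolate_chunk(chunk, interpolation_limit, frequency):
    """Resample and interpolate one block of gage columns (time on the index)."""
    return (
        chunk.resample("min")
        .interpolate(limit=interpolation_limit, limit_direction="both")
        .resample(frequency)
        .asfreq()
    )

def parallel_interpolate(observation_df_T, interpolation_limit, frequency, n_workers=4):
    """Split columns into n_workers chunks and interpolate them in parallel."""
    col_chunks = np.array_split(observation_df_T.columns, n_workers)
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        results = pool.map(
            _interpolate_chunk,
            (observation_df_T[cols] for cols in col_chunks),
            [interpolation_limit] * n_workers,
            [frequency] * n_workers,
        )
        # map preserves chunk order, so concat restores the original column order
        return pd.concat(list(results), axis=1)
```

With >7,500 columns the work divides cleanly, so the chunk count can simply match the available cores; the result should be identical to the serial version because no information crosses column boundaries.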