Hello,
I wonder if anyone can help me diagnose an issue with a Python script. Dask is used to read multiple files and call an xcape function, and the outputs are then saved to NetCDF files. The issue appears in the last step (saving the output). The error message is very long, something like this:
"2022-09-02 08:41:22,615 - distributed.scheduler - ERROR - Couldn't gather keys {"('transpose-f46d5e92cf682836ca8a24eb17d8553b', 790, 0, 0)": ['tcp://10.12.206.60:36276'], "('transpose-8c88781f634a9927c24c1d3f3c8a4b12', 1195, 0, 0)": ['tcp://10.12.206.60:43674'], "('transpose-8c88781f634a9927c24c1d3f3c8a4b12', 2007, 0, 0)": ['tcp://10.12.206.39:44215'], "('transpose-8c88781f634a9927c24c1d3f3c8a4b12', 2286, 0, 0)": ....".
All workers are then killed, but the data is still written to the NetCDF files.
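For reference, a minimal sketch of the workflow described above (all names here are hypothetical: `fake_srh` stands in for the xcape calculation, and the synthetic array stands in for data that the real notebook would read with something like `xarray.open_mfdataset`):

```python
# Sketch of the pattern: read chunked data, apply a per-chunk function,
# then trigger the computation when writing the output.
import dask.array as da

def fake_srh(block):
    # Placeholder for the per-chunk xcape computation (hypothetical).
    # Reduces the time axis of each chunk to a single level.
    return block.sum(axis=0, keepdims=True)

# Synthetic (time, lat, lon) data standing in for the multi-file input.
data = da.ones((8, 4, 4), chunks=(2, 4, 4))

# Apply the function chunk by chunk; each (2, 4, 4) chunk becomes (1, 4, 4).
result = da.map_blocks(fake_srh, data, chunks=(1, 4, 4))

# In the real notebook, .to_netcdf() is what triggers this compute step --
# which is where the "Couldn't gather keys" error appears.
out = result.compute()
print(out.shape)  # (4, 4, 4)
```

The key point is that nothing is actually computed until the final write, so an error raised while saving usually reflects a failure (often worker memory exhaustion) in the upstream computation rather than in the NetCDF write itself.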
My script can be found at:
/glade/u/home/lantao/python/2022-08-31_ARISE/p_cal_ARISE_hist_dask_SRH03.ipynb
Thank you!
Last updated: May 16 2025 at 17:14 UTC