documentation build issues on RTD #3697


Closed

keewis opened this issue Jan 15, 2020 · 8 comments

@keewis (Collaborator) commented Jan 15, 2020

We seem to have random failures on RTD.

Some of these are the known memory issue: recreating my doc environment used about 1.4 GB of RAM, which might be too much for RTD, even with the extended memory.

Much more common is a timeout when building the docs, but I can't reproduce it locally. Any ideas?

Edit: this really is random; I tried rerunning and the build passed.

Also, a warning:

proj_create: init=epsg:/init=IGNF: syntax not supported in non-PROJ4 emulation mode
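One possible (unconfirmed) cause of that warning is a dependency still passing the deprecated `+init=epsg:XXXX` string, which PROJ 6+ only accepts in PROJ4-emulation mode. With pyproj 2+, constructing the CRS from the EPSG code directly avoids it; a minimal sketch, assuming pyproj is the library emitting the warning:

```python
from pyproj import CRS

# deprecated style that triggers the warning on PROJ 6+:
#   Proj(init="epsg:4326")
# preferred: build the CRS object from the EPSG code directly
crs = CRS.from_epsg(4326)
print(crs.to_epsg())  # 4326
```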
@rabernat (Contributor)

Many projects have moved away from RTD for this reason. We can easily build the docs on Travis and then use doctr to deploy them.
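For reference, a sketch of what the Travis + doctr setup might look like — a hypothetical `.travis.yml` fragment; the doc paths are assumptions, not necessarily our actual layout:

```yaml
language: python
install:
  - pip install sphinx doctr
script:
  - sphinx-build -b html doc/ doc/_build/html
  # doctr pushes the built HTML to gh-pages using an encrypted deploy key
  - doctr deploy --built-docs doc/_build/html .
```

The deploy directory argument (`.` above) could in principle be varied per PR, which is what would make per-PR documentation previews possible.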

Is there a downside to this?

@keewis (Collaborator, Author) commented Jan 15, 2020

Not sure about downsides, but I think we could use this to provide documentation previews for PRs?

@crusaderky (Contributor)

Very glad to upvote anything that rids us of the RTD CI!

@crusaderky (Contributor)

The obvious downside is that existing links to internal pages of our documentation would break. Also, I'm unsure how straightforward it would be to rebuild all of our historical versions.

@keewis (Collaborator, Author) commented Jan 28, 2020

It may be that the timeouts are not caused by RTD: I have tried building the documentation several times, and it sometimes hangs while reading / building (?) dask.rst. This is the traceback if I trigger a KeyboardInterrupt:

KeyboardInterrupt                         Traceback (most recent call last)
<ipython-input-4-2ef53683336b> in <module>
----> 1 ds.to_netcdf('manipulated-example-data.nc')

.../xarray/core/dataset.py in to_netcdf(self, path, mode, format, group, engine, encoding, unlimited_dims, compute, invalid_netcdf)
   1543             unlimited_dims=unlimited_dims,
   1544             compute=compute,
-> 1545             invalid_netcdf=invalid_netcdf,
   1546         )
   1547

.../xarray/backends/api.py in to_netcdf(dataset, path_or_file, mode, format, group, engine, encoding, unlimited_dims, compute, multifile, invalid_netcdf)
   1095             return writer, store
   1096
-> 1097         writes = writer.sync(compute=compute)
   1098
   1099         if path_or_file is None:

.../xarray/backends/common.py in sync(self, compute)
    202                 compute=compute,
    203                 flush=True,
--> 204                 regions=self.regions,
    205             )
    206             self.sources = []

.../lib/python3.7/site-packages/dask/array/core.py in store(sources, targets, lock, regions, compute, return_stored, **kwargs)
    921
    922         if compute:
--> 923             result.compute(**kwargs)
    924             return None
    925         else:

.../lib/python3.7/site-packages/dask/base.py in compute(self, **kwargs)
    163         dask.base.compute
    164         """
--> 165         (result,) = compute(self, traverse=False, **kwargs)
    166         return result
    167

.../lib/python3.7/site-packages/dask/base.py in compute(*args, **kwargs)
    434     keys = [x.__dask_keys__() for x in collections]
    435     postcomputes = [x.__dask_postcompute__() for x in collections]
--> 436     results = schedule(dsk, keys, **kwargs)
    437     return repack([f(r, *a) for r, (f, a) in zip(results, postcomputes)])
    438

.../lib/python3.7/site-packages/dask/threaded.py in get(dsk, result, cache, num_workers, pool, **kwargs)
     79         get_id=_thread_get_id,
     80         pack_exception=pack_exception,
---> 81         **kwargs
     82     )
     83

.../lib/python3.7/site-packages/dask/local.py in get_async(apply_async, num_workers, dsk, result, cache, get_id, rerun_exceptions_locally, pack_exception, raise_exception, callbacks, dumps, loads, **kwargs)
    473             # Main loop, wait on tasks to finish, insert new ones
    474             while state["waiting"] or state["ready"] or state["running"]:
--> 475                 key, res_info, failed = queue_get(queue)
    476                 if failed:
    477                     exc, tb = loads(res_info)

.../lib/python3.7/site-packages/dask/local.py in queue_get(q)
    131
    132     def queue_get(q):
--> 133         return q.get()
    134
    135

.../lib/python3.7/queue.py in get(self, block, timeout)
    168             elif timeout is None:
    169                 while not self._qsize():
--> 170                     self.not_empty.wait()
    171             elif timeout < 0:
    172                 raise ValueError("'timeout' must be a non-negative number")

.../lib/python3.7/threading.py in wait(self, timeout)
    294         try:    # restore state no matter what (e.g., KeyboardInterrupt)
    295             if timeout is None:
--> 296                 waiter.acquire()
    297                 gotit = True
    298             else:

KeyboardInterrupt:

Is there something that could cause a deadlock?
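A guess at the mechanism, not necessarily xarray's actual bug: the traceback ends in `q.get()` with no timeout, so if a worker thread ever exits without posting its result to the queue, the scheduler's main loop waits forever. A minimal stdlib sketch of that failure mode (all names here are hypothetical):

```python
import queue
import threading

q = queue.Queue()

def worker(fail):
    # a well-behaved worker always posts a result; this one swallows
    # its own error and posts nothing, starving the consumer
    try:
        if fail:
            raise RuntimeError("boom")
        q.put("done")
    except RuntimeError:
        pass

t = threading.Thread(target=worker, args=(True,))
t.start()
t.join()

# q.get() with timeout=None would block forever here; a timeout
# turns the hang into a visible queue.Empty instead
try:
    result = q.get(timeout=0.1)
except queue.Empty:
    result = None
```

If something like this is happening inside the dask scheduler, forcing `dask.config.set(scheduler="synchronous")` while building the docs might make the hang reproducible in the main thread.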

@dcherian (Contributor)

Yeah I run into this occasionally.

@fmaussion (Member)

> Is there a downside to this?

Wouldn't we lose the ability to explore older versions of the docs? Or does doctr also provide this service?

It seems so silly to have to reinvent readthedocs just because of their CI...

@keewis (Collaborator, Author) commented Feb 21, 2021

We haven't seen this for quite some time, so I assume we can close this.

@keewis keewis closed this as completed Feb 21, 2021