
Deadlock with trio, when streaming changes #195

Open
palkeo opened this issue May 21, 2020 · 0 comments
Labels: bug (Something isn't working) · not qualified (The issue is not checked yet by the owners)
palkeo commented May 21, 2020

Hi!

I think I found a deadlock in the library.
It happens when using trio, while iterating over the changes to a table: if the connection is closed, an exception is raised, we enter the __aexit__ of the r.open() context manager, and we deadlock in there.

Here is the sequence of events:

  • An exception is raised by the async for change in changes line (internally it comes from _get_next in net_trio.py).
  • The exception propagates, so we enter AsyncTrioConnectionContextManager.__aexit__().
  • That calls ConnectionInstance.close().
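
To see the shape of the bug in isolation, here is a minimal, self-contained trio sketch (all names are illustrative, not the library's): close() waits on an event that only the calling task itself can set. The real close() has no timeout, so it hangs forever.

    import trio

    async def main():
        reader_ended = trio.Event()

        async def close():
            # Waits for the reader task to finish - but it was called *from*
            # the reader task, which is blocked right here, so reader_ended
            # can never be set.
            with trio.move_on_after(2) as scope:  # timeout only so the demo exits
                await reader_ended.wait()
            if scope.cancelled_caught:
                print("deadlock: reader_ended was never set")

        async def reader():
            try:
                raise ConnectionError("socket closed")  # stands in for _get_next() failing
            except ConnectionError:
                await close()   # __aexit__ -> close() runs on the reader's own stack
            reader_ended.set()  # unreachable until close() returns

        async with trio.open_nursery() as nursery:
            nursery.start_soon(reader)

    trio.run(main)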

close() then blocks forever on the last line of this snippet:

        # We must not wait for the _reader_task if we got an exception, because that
        # means that we were called from it. Waiting would lead to a deadlock.
        if self._reader_ended_event:
            await self._reader_ended_event.wait()
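
A possible fix might be to skip the wait when close() is running on the reader task itself. A rough sketch (hypothetical, not the library's actual code; it assumes the reader task handle gets recorded as self._reader_task when the reader is spawned):

    import trio

    class ConnectionSketch:
        """Hypothetical sketch of the guard, not the library's actual code."""

        def __init__(self):
            self._reader_ended_event = trio.Event()
            self._reader_task = None  # recorded by the reader coroutine at startup

        async def close(self):
            # Only wait for the reader task to finish if we are not running
            # *inside* it; otherwise the event can never be set and we deadlock.
            if (self._reader_ended_event is not None
                    and trio.lowlevel.current_task() is not self._reader_task):
                await self._reader_ended_event.wait()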

I added logs and confirmed it blocks there indefinitely. Note that the in-code comment already describes this exact hazard, but the guard only checks that self._reader_ended_event exists, not that close() was called from outside the reader task.

Here is how to reproduce:

    async def print_changes(self, table):
        async with trio.open_nursery() as nursery, r.open(host=self.host, db=self.db, port=self.port, nursery=nursery) as conn:
            changes = await r.table(table).changes().run(conn)
            async for change in changes:
                logging.info({'name': table, **change})
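
For reference, a self-contained version of the repro (host/port/db/table values are placeholders, and I'm assuming the trio backend is selected with r.set_loop_type("trio"), as with the driver's other async backends):

    import logging

    import trio
    from rethinkdb import r

    r.set_loop_type("trio")  # use the trio backend (net_trio.py)

    async def print_changes(table):
        # placeholder connection parameters - point these at a real server
        async with trio.open_nursery() as nursery, \
                r.open(host="localhost", port=28015, db="test", nursery=nursery) as conn:
            changes = await r.table(table).changes().run(conn)
            async for change in changes:
                logging.info({"name": table, **change})

    trio.run(print_changes, "my_table")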

Now break the connection to RethinkDB (for example, by stopping the server), and you will see that the coroutine never raises an exception and never terminates.

Thank you!

@palkeo palkeo added bug Something isn't working not qualified The issue is not checked yet by the owners labels May 21, 2020
@gabor-boros gabor-boros added this to the Sprint #2 milestone May 22, 2020
@gabor-boros gabor-boros removed this from the Sprint #2 milestone Sep 29, 2021