Bidir streaming #209

Closed · wants to merge 40 commits
All 40 commits are by goodboy:

4371a0a  Add initial bi-directional streaming (May 1, 2021)
babe62a  Expose `@context` decorator at top level (May 2, 2021)
b706cd9  Cancel scope on stream consumer completion (May 2, 2021)
3625a8c  Add basic test set (May 2, 2021)
ebf5315  Support passing `shield` at stream construction (May 7, 2021)
3f1adc0  Parametrize with async-for style tests (May 7, 2021)
68d600d  Fix typing (May 7, 2021)
d59e9ed  Be more pedantic with error handling (May 10, 2021)
be022b8  Use context for remote debugger locking (May 10, 2021)
ddc6c85  Add dynamic pubsub test using new bidir stream APIs (May 12, 2021)
bebe26c  Expose msg stream types at top level (May 12, 2021)
18135b4  Only send stop msg if not received from far end (May 12, 2021)
20e73c5  Fix up var naming and typing (May 12, 2021)
8017e55  Support no arg to `Context.started()` like trio (May 25, 2021)
732b9fe  Add error case (Jun 10, 2021)
a4a6df5  Only close recv chan if we get a ref (Jun 10, 2021)
910df13  Avoid mutate-on-iterate race (Jun 10, 2021)
b3437da  Add a multi-task streaming test (Jun 10, 2021)
79c8b75  Add a specially handled `ContextCancelled` error (Jun 13, 2021)
7069035  Expose streaming components at top level (Jun 13, 2021)
f8e2d40  Specially raise a `ContextCancelled` for a task-context rpc (Jun 13, 2021)
0af5852  Explicitly formalize context/streaming teardown (Jun 13, 2021)
0083145  Fix exception typing (Jun 14, 2021)
83c4b93  Wait for debugger lock task context termination (Jun 14, 2021)
201392a  Adjustments for non-frozen context dataclass change (Jun 14, 2021)
87f1af0  Drop trailing comma (Jun 14, 2021)
f2b1ef3  Add detailed ``@tractor.context`` cancellation/termination tests (Jun 14, 2021)
43ce533  Speed up the dynamic pubsub test (Jun 14, 2021)
197d291  Modernize streaming tests (Jun 14, 2021)
59c8f72  Don't clobber msg loop mem chan on rx stream close (Jun 14, 2021)
288e2b5  Set stream "end of channel" after shielded check! (Jun 14, 2021)
17dc6aa  Consider relaying context error via raised-in-scope-nursery task (Jun 24, 2021)
ced5d42  Add some brief todo notes on idea of shielded breakpoint (Jun 27, 2021)
627f107  Add temp warning msg for context cancel call (Jun 27, 2021)
17fca76  First try: pack cancelled tracebacks and ship to caller (Jun 27, 2021)
6f22ee8  Always shield cancel the caller on cancel-causing-errors, add teardow… (Jun 28, 2021)
6e75913  De-densify some code (Jun 30, 2021)
377b8c1  Add pre-stream open error conditions (Jun 30, 2021)
8371621  Expect context cancelled when we cancel (Jun 30, 2021)
e1533d3  Avoid mutate-during-iterate error (Jun 30, 2021)
498 changes: 498 additions & 0 deletions tests/test_2way.py

Large diffs are not rendered by default.

220 changes: 220 additions & 0 deletions tests/test_advanced_streaming.py
@@ -0,0 +1,220 @@
"""
Advanced streaming patterns using bidirectional streams and contexts.

"""
import itertools
from typing import Set, Dict, List

import trio
import tractor


_registry: Dict[str, Set[tractor.ReceiveMsgStream]] = {

goodboy (Owner Author) commented:

    Linking comment f9dd2ad#r50700925.

goodboy (Owner Author) commented:

@gc-ss mentioned,

… but if you don't like that, what if we make _registry an argument to publisher?

Yup, can do that; I just didn't because it turns out the consumers can update via sending updates on their individual streams.

It would be perfect if publisher accepted an environment variable (or a Context?)

This can also be done, though I'm not sure what you mean by environment. A `Context` isn't really required explicitly here, but it could be used to control cancellation if wanted.

For an alt to _registry we could do something like,

class PublisherState:
    subscriptions: dict[str, tractor.MsgStream] = {}

kinda thing?


gc-ss commented:

I'm not sure what you mean by environment. A Context isn't really required explicitly here but could be used to control cancellation if wanted

By environment or Context, I meant the context/environment/configuration that can make the function behave in a non-deterministic manner, but whose values don't necessarily change between function invocations, like a remote host address:port or a username/password.

In this case - think about two actors communicating with each other. The exact same actors (types) might behave in very different ways if their context/environment/configuration were different with everything else (eg: arguments) being the same.

Unlike arguments, though, their context/environment/configuration doesn't change between function invocations (the way arguments do).

For example, the registry does not need to change between function invocations.


goodboy (Owner Author) commented:

I mean, yeah, a service actor can easily create whatever objects it needs and mutate them as actor-local state.
It can go further and offer a separate API to mutate that object/state from other actors if needed.

I'm just not sure we need to show that in this example since the point is just a super basic dynamic pubsub system.
We do have a pretty terrible example in the docs that could be improved.

Bonus points for sweet PRs 😉

'even': set(),
'odd': set(),
}
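As a concrete reading of the `PublisherState` idea floated in the thread above, here is a minimal stdlib-only sketch; the class name, fields, and helper method are hypothetical illustrations, and the PR itself keeps the module-level `_registry`:

```python
from dataclasses import dataclass, field


@dataclass
class PublisherState:
    # topic name -> set of subscriber stream handles
    # (default_factory avoids sharing one dict across instances)
    subscriptions: dict = field(default_factory=lambda: {
        'even': set(),
        'odd': set(),
    })

    def topic_for(self, val: int) -> str:
        # the same even/odd routing rule the publisher below uses
        return 'even' if val % 2 == 0 else 'odd'
```

A `publisher(state: PublisherState)` signature would then replace the `global _registry` access, making the routing state explicit per service instance.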


async def publisher(

seed: int = 0,

) -> None:

global _registry

def is_even(i):
return i % 2 == 0

for val in itertools.count(seed):

sub = 'even' if is_even(val) else 'odd'

for sub_stream in _registry[sub].copy():
await sub_stream.send(val)

# throttle send rate to ~1kHz
# making it readable to a human user
await trio.sleep(1/1000)


@tractor.context
async def subscribe(

ctx: tractor.Context,

) -> None:

global _registry

# syn caller

goodboy (Owner Author) commented: s/syn/sync


await ctx.started(None)

async with ctx.open_stream() as stream:

# update subs list as consumer requests
async for new_subs in stream:

new_subs = set(new_subs)
remove = new_subs - _registry.keys()

print(f'setting sub to {new_subs} for {ctx.chan.uid}')

# remove old subs
for sub in remove:
_registry[sub].remove(stream)

# add new subs for consumer
for sub in new_subs:
_registry[sub].add(stream)
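The sub-list bookkeeping above amounts to set arithmetic on the registry keys. A stdlib-only sketch of one way to express it (streams modeled as plain strings; note this variant computes the removal set as topics-no-longer-wanted, a slight variation on the test code):

```python
def update_subs(
    registry: dict,
    stream: str,
    new_subs: list,
) -> None:
    '''Point ``stream`` at exactly the topics named in ``new_subs``.'''
    wanted = set(new_subs)

    # drop the stream from topics it no longer wants
    for topic in registry.keys() - wanted:
        registry[topic].discard(stream)

    # subscribe it to each requested (and known) topic
    for topic in wanted & registry.keys():
        registry[topic].add(stream)


registry = {'even': set(), 'odd': set()}
update_subs(registry, 'stream-a', ['even'])
update_subs(registry, 'stream-a', ['odd'])
# 'stream-a' now only receives the 'odd' topic
```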


async def consumer(

subs: List[str],

) -> None:

uid = tractor.current_actor().uid

async with tractor.wait_for_actor('publisher') as portal:
async with portal.open_context(subscribe) as (ctx, first):
async with ctx.open_stream() as stream:

# flip between the provided subs dynamically
if len(subs) > 1:

for sub in itertools.cycle(subs):
print(f'setting dynamic sub to {sub}')
await stream.send([sub])

count = 0
async for value in stream:
print(f'{uid} got: {value}')
if count > 5:
break
count += 1

else: # static sub

await stream.send(subs)
async for value in stream:
print(f'{uid} got: {value}')


def test_dynamic_pub_sub():

global _registry

from multiprocessing import cpu_count
cpus = cpu_count()

async def main():
async with tractor.open_nursery() as n:

# name of this actor will be same as target func
await n.run_in_actor(publisher)

for i, sub in zip(
range(cpus - 2),
itertools.cycle(_registry.keys())
):
await n.run_in_actor(
consumer,
name=f'consumer_{sub}',
subs=[sub],
)

# make one dynamic subscriber
await n.run_in_actor(

goodboy (Owner Author) commented:

Hmm, I'm thinking we could just move the dynamic case into the immediate task here instead of sleeping?

consumer,
name='consumer_dynamic',
subs=list(_registry.keys()),
)

# block until cancelled by user
goodboy marked this conversation as resolved.
with trio.fail_after(3):
await trio.sleep_forever()

try:
trio.run(main)
except trio.TooSlowError:
pass


@tractor.context
async def one_task_streams_and_one_handles_reqresp(

ctx: tractor.Context,

) -> None:

await ctx.started()

async with ctx.open_stream() as stream:

async def pingpong():
'''Run a simple req/response service.

'''
async for msg in stream:
print('rpc server ping')
assert msg == 'ping'
print('rpc server pong')
await stream.send('pong')

async with trio.open_nursery() as n:
n.start_soon(pingpong)

for _ in itertools.count():
await stream.send('yo')
await trio.sleep(0.01)


def test_reqresp_ontopof_streaming():
'''Test a subactor that both streams with one task and
spawns another which handles a small request-response
dialogue over the same bidir-stream.

'''
async def main():

with trio.move_on_after(2):
async with tractor.open_nursery() as n:

# name of this actor will be same as target func
portal = await n.start_actor(
'dual_tasks',
enable_modules=[__name__]
)

# flag to make sure we get at least one pong
got_pong: bool = False

async with portal.open_context(
one_task_streams_and_one_handles_reqresp,

) as (ctx, first):

assert first is None

async with ctx.open_stream() as stream:

await stream.send('ping')

async for msg in stream:
print(f'client received: {msg}')

assert msg in {'pong', 'yo'}

if msg == 'pong':
got_pong = True
await stream.send('ping')
print('client sent ping')

assert got_pong

try:
trio.run(main)
except trio.TooSlowError:
pass
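The req/resp-on-top-of-streaming pattern tested above can be mimicked with the stdlib alone. A rough `asyncio` analogue, where two queues stand in for the single duplex msg stream (none of these names are tractor APIs):

```python
import asyncio


async def reqresp_demo() -> list:
    # client -> server and server -> client halves of one "duplex stream"
    to_server: asyncio.Queue = asyncio.Queue()
    to_client: asyncio.Queue = asyncio.Queue()
    received = []

    async def server():
        # answer each ping with a pong; ``None`` is a close sentinel
        while (msg := await to_server.get()) is not None:
            assert msg == 'ping'
            await to_client.put('pong')

    async def client():
        for _ in range(3):
            await to_server.put('ping')
            received.append(await to_client.get())
        await to_server.put(None)

    await asyncio.gather(server(), client())
    return received


print(asyncio.run(reqresp_demo()))  # ['pong', 'pong', 'pong']
```

The real test adds a second server-side task streaming `'yo'` concurrently over the same stream, which is what the bidir protocol makes possible.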
9 changes: 7 additions & 2 deletions tests/test_streaming.py
@@ -32,13 +32,16 @@ async def async_gen_stream(sequence):
 
     # block indefinitely waiting to be cancelled by ``aclose()`` call
     with trio.CancelScope() as cs:
-        await trio.sleep(float('inf'))
+        await trio.sleep_forever()
         assert 0
     assert cs.cancelled_caught


 @tractor.stream
-async def context_stream(ctx, sequence):
+async def context_stream(
+    ctx: tractor.Context,
+    sequence
+):
     for i in sequence:
         await ctx.send_yield(i)
         await trio.sleep(0.1)
@@ -338,6 +341,8 @@ async def consume(task_status=trio.TASK_STATUS_IGNORED):
                 print("all values streamed, BREAKING")
                 break
 
+    cs.cancel()
+
     # TODO: this is justification for a
     # ``ActorNursery.stream_from_actor()`` helper?
     await portal.cancel_actor()
19 changes: 16 additions & 3 deletions tractor/__init__.py
@@ -5,11 +5,21 @@
 from trio import MultiError
 
 from ._ipc import Channel
-from ._streaming import Context, stream
+from ._streaming import (
+    Context,
+    ReceiveMsgStream,
+    MsgStream,
+    stream,
+    context,
+)
 from ._discovery import get_arbiter, find_actor, wait_for_actor
 from ._trionics import open_nursery
 from ._state import current_actor, is_root_process
-from ._exceptions import RemoteActorError, ModuleNotExposed
+from ._exceptions import (
+    RemoteActorError,
+    ModuleNotExposed,
+    ContextCancelled,
+)
 from ._debug import breakpoint, post_mortem
 from . import msg
 from ._root import run, run_daemon, open_root_actor
@@ -21,6 +31,7 @@
     'ModuleNotExposed',
     'MultiError',
     'RemoteActorError',
+    'ContextCancelled',
     'breakpoint',
     'current_actor',
     'find_actor',
@@ -33,7 +44,9 @@
     'run',
     'run_daemon',
     'stream',
-    'wait_for_actor',
+    'context',
+    'ReceiveMsgStream',
+    'MsgStream',
     'to_asyncio',
+    'wait_for_actor',
 ]