
Performance hit on websockets when upgrading to channels 2 #943

Closed
remusmp opened this issue Mar 1, 2018 · 5 comments
remusmp commented Mar 1, 2018

Hi,

I've been running channels v1.x on a Raspberry Pi for a couple of months now and I'm quite satisfied with the performance. However, when I tested the new channels v2 I noticed a performance hit when sending a message through websockets. I'm measuring send times in the following way:

v2:

from time import time

start = time()
MyConsumer.send(data)
print("Elapsed time send: {}".format(time() - start))

where MyConsumer is:

from asgiref.sync import async_to_sync
from channels.generic.websocket import AsyncJsonWebsocketConsumer
from channels.layers import get_channel_layer


class AConsumer(AsyncJsonWebsocketConsumer):
    groupName = None

    def onConnect(self):
        pass


    async def connect(self):
        self.onConnect()
        await self.accept()
        await self.channel_layer.group_add(self.groupName, self.channel_name)


    async def disconnect(self, code):
        await self.channel_layer.group_discard(
            self.groupName,
            self.channel_name,
        )


    async def group_message(self, event):
        await self.send_json(event['json'])


    @classmethod
    def send(cls, json):
        channelLayer = get_channel_layer()
        async_to_sync(channelLayer.group_send)(cls.groupName, {
            "type": "group.message",
            "json": json
        })

class MyConsumer(AConsumer):
    groupName = "test"

    def onConnect(self):
        # I skip this code
        pass
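[Editorial note: one thing worth keeping in mind about the measurement above (an observation about the measurement, not a confirmed cause of the regression) is that `async_to_sync(...)` called from plain synchronous code has to run the coroutine on an event loop, and when no loop is already running that can mean per-call loop setup. A self-contained sketch, using only the standard library, of how a fresh-loop-per-call pattern compares to a long-lived loop:]

```python
import asyncio
import time

async def noop():
    # Stand-in for an awaitable such as channel_layer.group_send(...)
    await asyncio.sleep(0)

N = 200

# Pattern 1: a fresh event loop per call, roughly the shape of calling
# async_to_sync(...) from synchronous code with no loop running.
start = time.perf_counter()
for _ in range(N):
    loop = asyncio.new_event_loop()
    try:
        loop.run_until_complete(noop())
    finally:
        loop.close()
per_call_fresh = (time.perf_counter() - start) / N

# Pattern 2: one long-lived loop, as inside a running async consumer.
async def run_many():
    for _ in range(N):
        await noop()

start = time.perf_counter()
loop = asyncio.new_event_loop()
try:
    loop.run_until_complete(run_many())
finally:
    loop.close()
per_call_reused = (time.perf_counter() - start) / N

print(per_call_fresh, per_call_reused)
```

On a typical machine the fresh-loop pattern is markedly slower per call, though loop setup alone would not account for a constant 200 ms.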

Here are the results with v2:

Elapsed time send: 0.24855780601501465
Elapsed time send: 0.25104665756225586
Elapsed time send: 0.24920940399169922
Elapsed time send: 0.25311732292175293
Elapsed time send: 0.24442577362060547
Elapsed time send: 0.2445998191833496
Elapsed time send: 0.2364206314086914
Elapsed time send: 0.24710321426391602
Elapsed time send: 0.2509162425994873
Elapsed time send: 0.24585413932800293

Server is started this way:

daphne -b 0.0.0.0 -p 8000 app.asgi:application

Here's how I measure time in v1.x:

from time import time
import json

from channels import Group

start = time()
Group('test').send({'text': json.dumps(data)})
print("Elapsed time send: {}".format(time()-start))

Here are the results with v1.x (same data is being sent):

Elapsed time send: 0.041016340255737305
Elapsed time send: 0.04160451889038086
Elapsed time send: 0.04173541069030762
Elapsed time send: 0.04191994667053223
Elapsed time send: 0.04093027114868164
Elapsed time send: 0.040888071060180664
Elapsed time send: 0.042260169982910156
Elapsed time send: 0.04185986518859863
Elapsed time send: 0.04205727577209473
Elapsed time send: 0.04068183898925781

Servers are started this way in v1.x:

nohup daphne -b 0.0.0.0 -p 8000 app.asgi:channel_layer > daphne.log 2>&1 &
nohup python manage.py runworker &

pip versions in v1.x:

channels==1.1.6
daphne==1.3.0
Django==2.0.2
asgi-redis==1.4.2
asgiref==1.1.2
Twisted==17.5.0

pip versions in v2:

channels==2.0.2
channels-redis==2.1.0
daphne==2.0.4
Django==2.0.2
aioredis==1.0.0
asgiref==2.1.6
Twisted==17.9.0

Could you give me a hint what to change in order to have a similar performance as in v1.x? Many thanks in advance!

remusmp commented Mar 1, 2018

Is there a 200 ms delay/sleep somewhere behind the scenes? I've just tested on an i7 PC and I'm getting pretty much the same timings :( I mention this because I don't think it's the Raspberry Pi slowing things down.

andrewgodwin (Member) commented

The only built-in sleep we have anywhere is a 10ms one in the channel layer receive method, and you're not using that here.

To clarify, are you saying that just the channel layer send is slow? That would be a very different problem to self.send on the consumer being slow.
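[Editorial note: to separate the two costs the way this question suggests, one approach is to time each awaited stage independently rather than the whole send path. A sketch using stdlib stand-ins (the `fake_*` functions below are placeholders, not real channels calls):]

```python
import asyncio
import time

async def fake_group_send():
    # Stand-in for channel_layer.group_send(...): simulate a network hop
    await asyncio.sleep(0.02)

async def fake_self_send():
    # Stand-in for self.send_json(...): simulate a local socket write
    await asyncio.sleep(0.001)

async def measure():
    # Time each stage on its own so a slow channel layer cannot be
    # confused with a slow consumer send.
    timings = {}
    for name, stage in [("group_send", fake_group_send),
                        ("self.send", fake_self_send)]:
        start = time.perf_counter()
        await stage()
        timings[name] = time.perf_counter() - start
    return timings

timings = asyncio.run(measure())
print(timings)
```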


remusmp commented Mar 1, 2018

I've done a few more tests on my home PC, and it seems that things are much slower when using groups. I created three different echo consumers to compare:

  1. AsyncJsonWebsocketConsumer with groups
  2. AsyncConsumer with groups
  3. AsyncConsumer without groups

I got these timing results:

1.
Elapsed time 0.026857614517211914.
Elapsed time 0.026637554168701172.
Elapsed time 0.03318667411804199.
Elapsed time 0.03062891960144043.
Elapsed time 0.026413679122924805.
Elapsed time 0.028857707977294922.
Elapsed time 0.03323197364807129.
Elapsed time 0.026538610458374023.
Elapsed time 0.036324501037597656.
Elapsed time 0.0322566032409668.
Elapsed time 0.0385587215423584.
Elapsed time 0.02066636085510254.
Elapsed time 0.03473186492919922.
Elapsed time 0.03095531463623047.
Elapsed time 0.022891759872436523.

2.
Elapsed time 0.02672290802001953.
Elapsed time 0.022529125213623047.
Elapsed time 0.042157888412475586.
Elapsed time 0.028131961822509766.
Elapsed time 0.035176753997802734.
Elapsed time 0.03284716606140137.
Elapsed time 0.03023362159729004.
Elapsed time 0.03440999984741211.
Elapsed time 0.029916763305664062.
Elapsed time 0.03461766242980957.

3. 
Elapsed time 0.00019168853759765625.
Elapsed time 9.107589721679688e-05.
Elapsed time 0.00020265579223632812.
Elapsed time 0.0004429817199707031.
Elapsed time 0.0002741813659667969.
Elapsed time 0.00019693374633789062.
Elapsed time 0.00026679039001464844.
Elapsed time 0.0003097057342529297.
Elapsed time 0.00017261505126953125.
Elapsed time 0.0003085136413574219.
Elapsed time 0.00048232078552246094.
Elapsed time 0.0002384185791015625.

It looks like groups make things ~100 times slower.
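[Editorial note: a plausible explanation for the gap, consistent with the numbers above but not a confirmed diagnosis of channels_redis internals: `group_send` has to fan a message out to every member channel, so its cost scales with group size, and over Redis each member send can add a round trip. A toy in-memory layer makes the fan-out shape explicit:]

```python
import asyncio

class InMemoryLayer:
    """Toy channel layer for illustration only; not the channels_redis code."""

    def __init__(self):
        self.channels = {}   # channel name -> asyncio.Queue of messages
        self.groups = {}     # group name -> set of channel names

    async def group_add(self, group, channel):
        self.groups.setdefault(group, set()).add(channel)

    async def send(self, channel, message):
        await self.channels.setdefault(channel, asyncio.Queue()).put(message)

    async def group_send(self, group, message):
        # Fan-out: one send per group member, so group_send is O(members),
        # and over a networked layer each member send may cost a round trip.
        for channel in self.groups.get(group, set()):
            await self.send(channel, message)

async def demo():
    layer = InMemoryLayer()
    for i in range(3):
        await layer.group_add("test", "chan.{}".format(i))
    await layer.group_send("test", {"type": "chat.message", "json": {"x": 1}})
    return {name: q.qsize() for name, q in layer.channels.items()}

sizes = asyncio.run(demo())
print(sizes)
```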

Here are the consumers:

from time import time

from channels.consumer import AsyncConsumer
from channels.generic.websocket import AsyncJsonWebsocketConsumer
from channels.layers import get_channel_layer

class EchoConsumer1(AsyncJsonWebsocketConsumer):

    async def connect(self):
        await self.accept()
        await self.channel_layer.group_add("test", self.channel_name)

    async def disconnect(self, close_code):
        await self.channel_layer.group_discard("test", self.channel_name)

    async def receive_json(self, content):
        start = time()
        await self.channel_layer.group_send(
            "test",
            {
                "type": "chat.message",
                "json": content
            }
        )
        print("Elapsed time {}.".format(time()-start))

    async def chat_message(self, event):
        await self.send_json(event["json"])


class EchoConsumer2(AsyncConsumer):

    async def websocket_connect(self, event):
        await self.send({
            "type": "websocket.accept",
        })
        await self.channel_layer.group_add("test", self.channel_name)

    async def websocket_send(self, event):
        await self.send(event)

    async def websocket_receive(self, event):
        start = time()
        channel_layer = get_channel_layer()
        await channel_layer.group_send(
            "test",
            {
                "type": "websocket.send",
                "text": event["text"],
            }
        )
        print("Elapsed time {}.".format(time()-start))


class EchoConsumer3(AsyncConsumer):

    async def websocket_connect(self, event):
        await self.send({
            "type": "websocket.accept",
        })

    async def websocket_receive(self, event):
        start = time()
        await self.send({
            "type": "websocket.send",
            "text": event["text"],
        })
        print("Elapsed time {}.".format(time()-start))
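[Editorial note on methodology, not something raised in the thread: the timings above use `time.time()`, whose resolution can be coarse on some platforms; `time.perf_counter()` is the usual choice for short intervals. A minimal example:]

```python
import time

# time.time() tracks wall-clock time and may have coarse resolution on
# some platforms; time.perf_counter() is monotonic and uses the highest
# resolution clock available, which matters for the sub-millisecond
# intervals measured above.
start = time.perf_counter()
time.sleep(0.01)
elapsed = time.perf_counter() - start
print(elapsed)
```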

andrewgodwin (Member) commented

I can believe that - could you open this as a channels_redis ticket, then, please, and include the info above? There's probably an easy optimisation to make.

sebhaase commented

Here is the link to the channels_redis ticket: django/channels_redis#83
