Is sending and receiving at the same time breaking the rules? #1004
caffeinism started this conversation in General · 0 comments
Referenced code: httpcore/httpcore/_async/http11.py, lines 83 to 112 at 38f277c
Of course, this isn't the normal pattern, so I'm not suggesting a change to the library; it's pure curiosity. The other day I was writing a program in which the client streams some data to a server, and the server processes the data and responds. While writing it, I realized that, contrary to what I had assumed, most clients can't process the response until the entire request body has been sent (because execution blocks at line 88 of the code referenced above). Below is a small snippet using FastAPI and httpx that implements the idea.
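Something along these lines (simplified to the essentials; the endpoint path and port are placeholders):

```python
# server.py -- respond as soon as the first chunk of the request body arrives
from fastapi import FastAPI, Request
from fastapi.responses import PlainTextResponse

app = FastAPI()

@app.post("/upload")
async def upload(request: Request):
    async for chunk in request.stream():
        # Answer immediately after the first chunk, without waiting for
        # the rest of the request body.
        return PlainTextResponse("got the first chunk")
    return PlainTextResponse("empty body")
```

```python
# client.py -- drip-feed the request body for ~90 seconds
import asyncio
import httpx

async def slow_body():
    for _ in range(90):
        yield b"chunk\n"
        await asyncio.sleep(1)

async def main():
    async with httpx.AsyncClient(timeout=None) as client:
        # This await only completes after the body generator is exhausted,
        # even though the server has already sent its response.
        response = await client.post("http://localhost:8000/upload", content=slow_body())
        print(response.status_code, response.text)

asyncio.run(main())
```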
In this code, the server sends its response as soon as it receives the first body chunk, but the client doesn't see it: the response object only becomes available after 90 seconds, once the entire request body has been sent. What if we replaced that code with something like the following?
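The change I mean, roughly, is to overlap sending the request body with receiving the response headers instead of awaiting them strictly one after the other. This is only a sketch of that idea (it assumes the asyncio backend, and the helper name is made up; it is not an actual patch):

```python
import asyncio

async def send_body_and_receive_headers(self, **kwargs):
    # Hypothetical helper; "self" stands for the HTTP/1.1 connection object.
    # Start writing the request body in the background...
    send_task = asyncio.ensure_future(self._send_request_body(**kwargs))
    # ...and read the response headers concurrently, so anything the server
    # sends early is consumed instead of waiting behind the upload.
    response_headers = await self._receive_response_headers(**kwargs)
    # The body may still be uploading at this point; the caller has to await
    # send_task eventually and handle any error or cancellation it raises.
    return response_headers, send_task
```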
(Of course, this is experimental code, and the added concurrency can be disastrous for debugging and error handling.)
In this case, the response can be buffered as it arrives, and you can get a response even before you finish sending the request! What I'm wondering is whether this mechanism violates the HTTP protocol (with respect to the timing of returning a response), or whether it could actually introduce other risks in the code.