
HttpRequest Timeout #2

Open
iprovalo opened this issue Feb 15, 2024 · 3 comments

Comments

@iprovalo

In the ChatClient (or any client) it would be nice to have a timeout for the HttpRequest:

private HttpRequest createPostRequest(CreateChatCompletionRequest request, Long requestTimeout) {
  return newHttpRequestBuilder(
          Constants.CONTENT_TYPE_HEADER,
          Constants.JSON_MEDIA_TYPE,
          Constants.ACCEPT_HEADER,
          Constants.JSON_MEDIA_TYPE)
      .timeout(Duration.ofMillis(requestTimeout))
      .uri(endpoint)
      .POST(createBodyPublisher(request))
      .build();
}
@StefanBratanov
Owner

Hi @iprovalo, thanks for raising this issue. I like the idea of adding this functionality. I have been thinking about a cleaner design and came up with setting a request timeout when configuring the OpenAI instance. The timeout will then apply to all requests, which should work for most use cases.

OpenAI openAI = OpenAI.newBuilder(System.getenv("OPENAI_API_KEY"))
    .requestTimeout(Duration.ofSeconds(10))
    .build();

This has been implemented as part of 28bb9c9
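For context on what that timeout does at runtime: the library builds on `java.net.http`, where a request timeout surfaces as an `HttpTimeoutException` once it elapses. Below is a minimal, self-contained sketch of that JDK behaviour; the local stalling server and the `RequestTimeoutDemo` class are illustrative assumptions, not part of this library:

```java
import com.sun.net.httpserver.HttpServer;
import java.net.InetSocketAddress;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.net.http.HttpTimeoutException;
import java.time.Duration;

public class RequestTimeoutDemo {

  // Returns true if the request timed out, which is what we expect here.
  static boolean requestTimesOut() throws Exception {
    // Local server that stalls for longer than the client-side timeout.
    HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
    server.createContext("/slow", exchange -> {
      try {
        Thread.sleep(2_000);
      } catch (InterruptedException ignored) {
      }
      exchange.sendResponseHeaders(200, -1);
      exchange.close();
    });
    server.start();
    try {
      HttpRequest request = HttpRequest.newBuilder()
          .uri(URI.create("http://localhost:" + server.getAddress().getPort() + "/slow"))
          .timeout(Duration.ofMillis(200)) // the request timeout
          .GET()
          .build();
      HttpClient.newHttpClient().send(request, HttpResponse.BodyHandlers.discarding());
      return false; // response arrived before the timeout
    } catch (HttpTimeoutException e) {
      return true; // timeout elapsed before the server responded
    } finally {
      server.stop(0);
    }
  }

  public static void main(String[] args) throws Exception {
    System.out.println("timedOut=" + requestTimesOut());
  }
}
```

Running this prints `timedOut=true`, since the server sleeps for 2 seconds while the request allows only 200 ms.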

@iprovalo
Author

Thank you, @StefanBratanov, this is very helpful!

I agree it covers most cases!

However, if possible, per-request flexibility could be very useful; in my case, some request types are consistently slower than others, and a per-request timeout would help differentiate them.

Thank you!

@StefanBratanov
Owner

StefanBratanov commented Feb 15, 2024


I agree it would be very nice and flexible to modify the HttpRequest per-request. I will think about a potential design. My main concern is cluttering the API too much. Will keep the issue open for now and update it if I find a solution.
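One possible shape for such an API, sketched here as a standalone helper: a per-request customizer that can override the configured default without adding a parameter for every `HttpRequest` property. The `createRequest` helper and the `Consumer<HttpRequest.Builder>` hook are hypothetical design assumptions, not part of this library:

```java
import java.net.URI;
import java.net.http.HttpRequest;
import java.time.Duration;
import java.util.function.Consumer;

public class PerRequestTimeoutSketch {

  // Hypothetical: build a request with library defaults, then let the
  // caller customize the builder before it is finalized.
  static HttpRequest createRequest(URI endpoint, Duration defaultTimeout,
                                   Consumer<HttpRequest.Builder> customizer) {
    HttpRequest.Builder builder = HttpRequest.newBuilder()
        .uri(endpoint)
        .timeout(defaultTimeout)
        .GET();
    if (customizer != null) {
      customizer.accept(builder); // per-request overrides, e.g. a longer timeout
    }
    return builder.build();
  }

  public static void main(String[] args) {
    // A "slow" request type overrides the 10-second default with 60 seconds.
    HttpRequest slow = createRequest(URI.create("https://example.com"),
        Duration.ofSeconds(10),
        b -> b.timeout(Duration.ofSeconds(60)));
    System.out.println(slow.timeout().orElseThrow()); // prints PT1M
  }
}
```

A single customizer hook like this keeps the public API surface small, which may address the clutter concern, though it does expose the underlying `HttpRequest.Builder` to callers.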
