
bug: google-vertex-ai forces you to specify the project_id as a parameter #1130

Open · nouamanebenbrahim opened this issue Oct 9, 2024 · 0 comments · May be fixed by #1141

When using the VertexAIGeminiGenerator, I believe this piece of code should run without issues, assuming that GCP_PROJECT_ID and GOOGLE_APPLICATION_CREDENTIALS are also set:

from haystack_integrations.components.generators.google_vertex import VertexAIGeminiGenerator

gemini = VertexAIGeminiGenerator()
result = gemini.run(parts=["What is the most interesting thing you know?"])
for answer in result["replies"]:
    print(answer)

However, requiring project_id as an explicit parameter hinders our ability to build a multi-LLM service abstraction on top of the various Haystack generators. We aim for a generator-agnostic implementation so we can test and use the best-performing models.
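For now, a minimal workaround sketch (assuming the generator accepts a project_id keyword argument, as the title of this issue suggests, and that GCP_PROJECT_ID is set in the environment) is to resolve the project ourselves and pass it through:

import os

from haystack_integrations.components.generators.google_vertex import VertexAIGeminiGenerator

# Workaround: read the project from the environment and pass it explicitly.
gemini = VertexAIGeminiGenerator(project_id=os.environ["GCP_PROJECT_ID"])
result = gemini.run(parts=["What is the most interesting thing you know?"])
for answer in result["replies"]:
    print(answer)

This keeps the calling code working, but it is exactly the kind of provider-specific plumbing we would like to avoid in a generator-agnostic layer.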

As you can see in my console, the variables are indeed set:

[Screenshot: console output showing GCP_PROJECT_ID and GOOGLE_APPLICATION_CREDENTIALS set]
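Not a concrete proposal for the fix, just a sketch of the behaviour I would expect: google.auth.default() already resolves both the credentials and the project from GOOGLE_APPLICATION_CREDENTIALS (or the gcloud configuration), so an explicit project_id should only be needed as an override. The resolve_project_id helper below is hypothetical and only illustrates that fallback:

from typing import Optional

import google.auth


def resolve_project_id(project_id: Optional[str] = None) -> Optional[str]:
    # Prefer an explicitly passed project_id; otherwise let google-auth
    # infer it from the application default credentials.
    if project_id is not None:
        return project_id
    _credentials, default_project = google.auth.default()
    return default_project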

@julian-risch transferred this issue from deepset-ai/haystack-integrations on Oct 10, 2024