
Fix: Force VERTEXAI_LOCATION for Gemini 2.5 on Vertex AI #1857


Draft · wants to merge 5 commits into main
Conversation

@PeterDaveHello (Contributor) commented Jun 8, 2025

User description

Fix Vertex AI location handling for specific Gemini models

Automatically set appropriate VERTEXAI_LOCATION for two Gemini models:

  • gemini-2.5-pro-preview-06-05: Force to "global" location as this
    model is only available globally on Vertex AI
  • gemini-2.5-flash-preview-05-20: Set to "us-central1" for US regions
    or "global" for non-US regions based on model availability constraints

This prevents "Publisher Model not found" errors by overriding the user's
configuration when necessary. Both litellm.vertex_location and the
VERTEXAI_LOCATION environment variable are updated automatically.

Addresses issues discussed in BerriAI/litellm#11447 where manual
location configuration was required for these specific models.
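The override rules described above can be sketched as a small pure decision function; `resolve_vertex_location` is an illustrative name for this summary, not code taken from the PR:

```python
from typing import Optional

# Hypothetical sketch of the location-override rules described above;
# the function name and signature are illustrative, not the PR's actual code.
def resolve_vertex_location(model_name: str, configured_location: Optional[str]) -> Optional[str]:
    """Return the location to force, or None if the user's setting can stand."""
    model = model_name.lower()
    current = (configured_location or "").lower()

    if "gemini-2.5-pro-preview-06-05" in model:
        # The Pro preview is only served from the 'global' location.
        return "global" if current != "global" else None

    if "gemini-2.5-flash-preview-05-20" in model:
        if current.startswith("us-") and current != "us-central1":
            # In the US, this model is only available in us-central1.
            return "us-central1"
        if not current.startswith("us-") and current != "global":
            # Outside the US, fall back to the global endpoint.
            return "global"

    return None
```

Returning None when no override applies leaves the user's configured location untouched.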

Summary from GitHub Copilot:

This pull request introduces logic to handle specific location requirements for certain Vertex AI Gemini models in the litellm_ai_handler.py file. It dynamically adjusts the VERTEXAI.VERTEX_LOCATION setting based on the model configuration and ensures compatibility with the models' constraints.

Vertex AI Gemini model location handling:

  • Added logic to check the model name from the configuration (model_name_from_config) and adjust the VERTEXAI.VERTEX_LOCATION setting accordingly:
    • For gemini-2.5-pro-preview-06-05, the location is overridden to 'global' if it is not already set to 'global'.
    • For gemini-2.5-flash-preview-05-20, the location is corrected to 'us-central1' for US regions if it is not already 'us-central1'. Non-US regions are set to 'global'.
  • Updated the environment variable VERTEXAI_LOCATION to reflect the new location when a change is made.

cc BerriAI/litellm#11447


PR Type

Bug fix, Enhancement


Description

  • Adds automatic Vertex AI location override for Gemini 2.5 models.

    • Forces VERTEXAI_LOCATION to global for gemini-2.5-pro-preview-06-05.
    • Sets VERTEXAI_LOCATION to us-central1 or global for gemini-2.5-flash-preview-05-20 as needed.
  • Updates both litellm.vertex_location and environment variable for compatibility.

  • Improves logging to inform users of location overrides.


Changes diagram

flowchart LR
  A["Model config loaded"] -- "Is Vertex AI model?" --> B["Check model name"]
  B -- "gemini-2.5-pro-preview-06-05" --> C["Set location to 'global'"]
  B -- "gemini-2.5-flash-preview-05-20 (US)" --> D["Set location to 'us-central1'"]
  B -- "gemini-2.5-flash-preview-05-20 (non-US)" --> E["Set location to 'global'"]
  C -- "Update litellm & env" --> F["Location override complete"]
  D -- "Update litellm & env" --> F
  E -- "Update litellm & env" --> F

Changes walkthrough 📝

Relevant files
Bug fix
litellm_ai_handler.py
Auto-detect and override Vertex AI location for Gemini 2.5 models

pr_agent/algo/ai_handlers/litellm_ai_handler.py

  • Adds logic to detect Gemini 2.5 models on Vertex AI.
  • Automatically sets litellm.vertex_location and VERTEXAI_LOCATION as
    required.
  • Provides detailed logging for location overrides.
  • Handles both Pro and Flash preview model variants.
  • +37/-0   

  • …Vertex
    
    For certain models on Vertex AI, specifically the new Gemini 2.5 Pro Preview
    (e.g., gemini-2.5-pro-preview-06-05), the VERTEXAI_LOCATION must be set
    to "global" for the model to function correctly.
    
    This change modifies the LiteLLMAIHandler to automatically set
    litellm.vertex_location and the environment variable VERTEXAI_LOCATION
    to "global" when such a model is selected and the provider is Vertex AI.
    This ensures you can use these models without manual configuration
    or encountering "Publisher Model not found" errors.
    
    Addresses the issue highlighted in the discussion around
    BerriAI/litellm#11447, where users
    confirmed that os.environ["VERTEXAI_LOCATION"] = "global" resolves
    the problem.

    This commit refines the handling of Vertex AI model locations.
    Specifically, it ensures that when `gemini-2.5-pro-preview-0605`
    is used on Vertex AI, the `litellm.vertex_location` and the
    environment variable `VERTEXAI_LOCATION` are explicitly set to "global".
    
    This is necessary because, as per Google's official documentation
    (https://cloud.google.com/vertex-ai/generative-ai/docs/learn/locations#available-regions),
    the `gemini-2.5-pro-preview-0605` model is only available in the
    'global' location. This change prevents "Publisher Model not found"
    errors for this specific model.
    
    This commit supersedes a previous attempt that had a broader condition
    and ensures the fix is targeted precisely as per your feedback and
    documentation.

    This commit correctly sets the VERTEXAI_LOCATION to "global"
    for the Vertex AI model `gemini-2.5-pro-preview-06-05` (with hyphen).
    
    The model name `gemini-2.5-pro-preview-06-05` is used as specified in
    Google's official documentation:
    https://cloud.google.com/vertex-ai/generative-ai/docs/models/gemini/2-5-pro
    
    This ensures that when this specific model is used,
    `litellm.vertex_location` and the environment variable
    `VERTEXAI_LOCATION` are explicitly set to "global", preventing
    "Publisher Model not found" errors.
    
    This commit supersedes previous attempts which either had a too broad
    condition or used an incorrect model name format (without hyphen).
    The logic now accurately targets the specified model.

    This commit implements specific location handling for two Vertex AI
    Gemini models:
    
    1.  `gemini-2.5-pro-preview-06-05`:
        - As per Google's documentation, this model is available in the 'global'
          location.
        - This change forces `litellm.vertex_location` and the
          `VERTEXAI_LOCATION` environment variable to "global" when this
          model is used, overriding other settings to prevent errors.
    
    2.  `gemini-2.5-flash-preview-05-20`:
        - This model is available in specific regions, including `us-central1`
          for the US, but not all US regions (e.g., not `us-east1` or `us-west1`).
        - If this model is selected and you have configured a US-based
          `VERTEXAI_LOCATION` other than `us-central1`, this change
          automatically corrects the location to `us-central1`.
        - This helps prevent errors when you inadvertently select an
          unsupported US region for this model.
    
    Both model names are checked with hyphens (e.g., "06-05") as per
    primary Google documentation.
    
    This consolidated fix addresses issues discussed regarding model
    availability and required location settings on Vertex AI.

    This commit implements the definitive specific location handling
    for two Vertex AI Gemini models, based on your detailed feedback
    and my investigation:
    
    1.  `gemini-2.5-pro-preview-06-05`:
        - This model requires the 'global' location on Vertex AI.
        - This change forces `litellm.vertex_location` and the
          `VERTEXAI_LOCATION` environment variable to "global" when this
          model is used, overriding other settings.
    
    2.  `gemini-2.5-flash-preview-05-20`:
        - If a US-based `VERTEXAI_LOCATION` (e.g., "us-east1") other than
          "us-central1" is configured by you, it is automatically
          corrected to "us-central1".
        - If any non-US `VERTEXAI_LOCATION` is configured by you and
          it's not already "global", it is set to "global".
        - This logic addresses the model's specific availability constraints
          as per your provided information (US only in us-central1,
          others via global).
    
    Both model names are checked with hyphens (e.g., "06-05") and
    comparisons are case-insensitive. Appropriate logging is added for
    when location overrides occur.
    
    This commit supersedes all previous attempts and incorporates the
    most up-to-date understanding of these models' location requirements.

    PR Reviewer Guide 🔍

    Here are some key observations to aid the review process:

    ⏱️ Estimated effort to review: 2 🔵🔵⚪⚪⚪
    🧪 No relevant tests
    🔒 No security concerns identified
    ⚡ Recommended focus areas for review

    Indentation Error

    The indentation in the conditional block for non-US regions appears to be inconsistent. Line 114 has one more space than expected compared to the surrounding code.

    get_logger().info(
       f"Model '{model_name_from_config}' on Vertex AI: User set location '{current_vertex_location}' "
    Error Handling

    The code doesn't handle potential errors when setting environment variables or when the model name format changes. Consider adding try-except blocks or more robust string parsing.

    model_name_from_config = get_settings().config.model
    model_name_lower = model_name_from_config.lower() # For case-insensitive checks
    current_vertex_location = get_settings().get("VERTEXAI.VERTEX_LOCATION", None)
    # Convert current_vertex_location to lower case for comparison, handling None
    current_vertex_location_lower = str(current_vertex_location).lower() if current_vertex_location else ""
    
    new_vertex_location = None # Initialize to None, will be set if a change is needed
    
    if "vertexai" in model_name_lower: # Only apply to vertexai models
        if "gemini-2.5-pro-preview-06-05" in model_name_lower:
            if current_vertex_location_lower != "global":
                get_logger().info(
                    f"Model '{model_name_from_config}' on Vertex AI requires 'global' location. "
                    f"Overriding current setting ('{current_vertex_location}') with 'global'."
                )
                new_vertex_location = "global"
        elif "gemini-2.5-flash-preview-05-20" in model_name_lower:
            if current_vertex_location_lower.startswith("us-") and current_vertex_location_lower != "us-central1":
                get_logger().info(
                    f"Model '{model_name_from_config}' on Vertex AI: User set US location '{current_vertex_location}' "
                    f"is not 'us-central1'. Correcting to 'us-central1' as per specific model requirements."
                )
                new_vertex_location = "us-central1"
            elif not current_vertex_location_lower.startswith("us-"): # Non-US regions
                if current_vertex_location_lower != "global":
                     get_logger().info(
                        f"Model '{model_name_from_config}' on Vertex AI: User set location '{current_vertex_location}' "
                        f"is not a US region. Setting to 'global' as per specific model requirements for non-US or other cases."
                    )
                     new_vertex_location = "global"
            # If it's 'us-central1' already, new_vertex_location remains None, no change needed.
            # If it's 'global' already (for non-US), new_vertex_location remains None, no change needed.
    
        if new_vertex_location:
            litellm.vertex_location = new_vertex_location
            os.environ["VERTEXAI_LOCATION"] = new_vertex_location
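Regarding the error-handling observation above, one defensive variant of the apply step could look like the sketch below; `apply_vertex_location` is an assumed helper name, not code from this PR:

```python
import os

# Hypothetical defensive variant of the apply step; the helper name and
# validation rules are illustrative assumptions, not code from the PR.
def apply_vertex_location(new_location) -> bool:
    """Apply the override to litellm and the environment; report success."""
    if not isinstance(new_location, str) or not new_location.strip():
        return False  # refuse empty or non-string locations
    try:
        import litellm
        litellm.vertex_location = new_location
    except ImportError:
        # litellm unavailable; still export the env var for downstream readers
        pass
    os.environ["VERTEXAI_LOCATION"] = new_location
    return True
```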


    PR Code Suggestions ✨

    Explore these optional code suggestions:

    Category: Possible issue
    Improve model detection logic

    The condition checks if "vertexai" is in the model name, but the model might be
    specified without this prefix in the config. Consider checking for Gemini model
    patterns directly to ensure all Vertex AI Gemini models are properly handled.

    pr_agent/algo/ai_handlers/litellm_ai_handler.py [97-104]

    -if "vertexai" in model_name_lower: # Only apply to vertexai models
    -    if "gemini-2.5-pro-preview-06-05" in model_name_lower:
    -        if current_vertex_location_lower != "global":
    -            get_logger().info(
    -                f"Model '{model_name_from_config}' on Vertex AI requires 'global' location. "
    -                f"Overriding current setting ('{current_vertex_location}') with 'global'."
    -            )
    -            new_vertex_location = "global"
    +# Check for Gemini models regardless of whether "vertexai" is in the name
    +if "gemini-2.5-pro-preview-06-05" in model_name_lower:
    +    if current_vertex_location_lower != "global":
    +        get_logger().info(
    +            f"Model '{model_name_from_config}' on Vertex AI requires 'global' location. "
    +            f"Overriding current setting ('{current_vertex_location}') with 'global'."
    +        )
    +        new_vertex_location = "global"
    Suggestion importance[1-10]: 4


    Why: The suggestion has merit but may introduce unintended side effects. The "vertexai" check ensures this logic only applies to Vertex AI models and prevents applying Vertex AI-specific location settings to the same model names used with other providers like Google AI Studio.

    Impact: Low
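The reviewer's point about the provider guard can be illustrated with a tiny check: the same Gemini model name may be routed through different providers, and the override should only fire for Vertex AI. The prefixes below follow common litellm-style naming and are assumptions, not taken from this PR:

```python
# Illustrative provider guard; the prefix conventions are assumptions.
def is_vertex_ai_model(model_name: str) -> bool:
    name = model_name.lower()
    # Match both "vertexai" substrings and an explicit "vertex_ai/" prefix.
    return "vertexai" in name or name.startswith("vertex_ai/")
```

Under these assumptions, `vertex_ai/gemini-2.5-pro-preview-06-05` would trigger the override, while `gemini/gemini-2.5-pro-preview-06-05` (Google AI Studio style) would not.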
    • Author self-review: I have reviewed the PR code suggestions, and addressed the relevant ones.

    @PeterDaveHello PeterDaveHello marked this pull request as draft June 8, 2025 10:28
    @PeterDaveHello (Contributor, Author)

    Should not do this here 😅
