Firebolt lets you call large language models (LLMs) directly from SQL through Amazon Bedrock. To invoke a model, use AWS_BEDROCK_AI_QUERY with a Bedrock model ID, a JSON request body, and a LOCATION that holds your AWS credentials. Alternatively, use AI_QUERY to invoke a model with a simple text prompt, a Bedrock model ID, and a location. LLM invocations in Firebolt count towards your account’s daily token budget. For details on setting your budget and checking current usage, see the sections below: Set or change your LLM token budget and Check your LLM token quota and daily usage.
LLM token budget accounting is not available in Firebolt Core.

Create a Bedrock LOCATION

Create a LOCATION once and reuse it wherever you need to call Bedrock models.

Authentication options and examples

  • Access key and secret
CREATE LOCATION bedrock_keys WITH
  SOURCE = AMAZON_BEDROCK
  CREDENTIALS = (
    AWS_ACCESS_KEY_ID = '<aws_access_key_id>'
    AWS_SECRET_ACCESS_KEY = '<aws_secret_access_key>'
  );
  • Temporary credentials (access key, secret, session token)
CREATE LOCATION bedrock_temp_creds WITH
  SOURCE = AMAZON_BEDROCK
  CREDENTIALS = (
    AWS_ACCESS_KEY_ID = '<temporary_access_key_id>'
    AWS_SECRET_ACCESS_KEY = '<temporary_secret_access_key>'
    AWS_SESSION_TOKEN = '<temporary_session_token>'
  );
  • IAM role ARN
CREATE LOCATION bedrock_role WITH
  SOURCE = AMAZON_BEDROCK
  CREDENTIALS = (
    AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/BedrockAccess'
  );
  • IAM role ARN with external ID
CREATE LOCATION bedrock_role_external_id WITH
  SOURCE = AMAZON_BEDROCK
  CREDENTIALS = (
    AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/BedrockAccess'
    AWS_ROLE_EXTERNAL_ID = '<external_id>'
  );
To create a LOCATION using an IAM role (with or without an external ID), see Use AWS roles to access Bedrock. For all options and parameters, see CREATE LOCATION (Amazon Bedrock).
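After creating a LOCATION, you can confirm it is registered before referencing it in queries. A minimal sketch, assuming your account exposes an information_schema.locations view (verify the view and its column names in your Firebolt version's catalog documentation):

```sql
-- Assumption: information_schema.locations lists LOCATION objects;
-- the exact column names may differ across Firebolt versions.
SELECT location_name, source
FROM information_schema.locations
WHERE location_name = 'bedrock_keys';
```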

Quick examples

Use these examples to try AI in Firebolt. For the full function reference and more details, see AWS_BEDROCK_AI_QUERY and AI_QUERY.

Invoke a model

In the examples below, my_bedrock_location refers to a LOCATION object that you create using one of the methods described above (access keys, temporary credentials, or IAM role).
SELECT AWS_BEDROCK_AI_QUERY(
    'amazon.nova-micro-v1:0',
    $${"schemaVersion": "messages-v1", "messages": [{"role": "user","content": [{"text": "What is the company name that AWS belongs to?"}]}],"system": [{"text": "Fulfill the user's request"}],"inferenceConfig": {}}$$,
    'my_bedrock_location') AS result;
SELECT AI_QUERY(
    'us.meta.llama3-3-70b-instruct-v1:0',
    'What is AWS?',
    'my_bedrock_location') AS result;

Invoke the LLM on multiple rows

SELECT n AS number,
       JSON_POINTER_EXTRACT_TEXT(
         AWS_BEDROCK_AI_QUERY(
           'amazon.nova-micro-v1:0',
           $${"schemaVersion": "messages-v1", "messages": [{"role": "user","content": [{"text": "$$ || n::TEXT || $$"}]}],"system": [{"text": "Write in English the capitalized name of the number that the user writes. Respond with a single word."}],"inferenceConfig": {}}$$,
           'my_bedrock_location'
         ),
         '/output/message/content/0/text'
       ) AS processed
FROM generate_series(1,3) s(n);
SELECT n AS number,
       AI_QUERY(
         'us.meta.llama3-3-70b-instruct-v1:0',
         'Write in English the capitalized name of the number. Respond with a single word: ' || n::TEXT,
         'my_bedrock_location'
       ) AS processed
FROM generate_series(1,3) s(n);

Sentiment analysis

SELECT JSON_POINTER_EXTRACT_TEXT(
  AWS_BEDROCK_AI_QUERY(
    'amazon.nova-micro-v1:0',
    $${"schemaVersion": "messages-v1", "messages": [{"role": "user","content": [{"text": "I love Firebolt."}]}],"system": [{"text": "Identify the sentiment in the user's sentence as a single uppercase word: POSITIVE or NEGATIVE. Respond with a single word."}],"inferenceConfig": {}}$$,
    'my_bedrock_location'
  ),
  '/output/message/content/0/text'
) AS sentiment;
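The same classification can be written with AI_QUERY, which returns the model's text directly, so no JSON extraction is needed. A sketch following the pattern of the earlier examples (adjust the model ID to one enabled in your account):

```sql
SELECT AI_QUERY(
    'us.meta.llama3-3-70b-instruct-v1:0',
    'Identify the sentiment of this sentence as a single uppercase word, POSITIVE or NEGATIVE: I love Firebolt.',
    'my_bedrock_location') AS sentiment;
```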

Set or change your LLM token budget

Set a daily LLM token budget for your account to control how many tokens AI functions such as AWS_BEDROCK_AI_QUERY and AI_QUERY can process each day. New accounts have a zero token budget by default, so LLM invocations fail until you set a budget.
ALTER ACCOUNT "<account_name>" SET (LLM_TOKEN_BUDGET = 10000);
For full syntax and details, see ALTER ACCOUNT.

Check your LLM token quota and daily usage

SELECT * FROM account_db.information_schema.quotas WHERE name='LLM_TOKEN_BUDGET';
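To see how close you are to the limit, you can compare consumption to the configured quota. A sketch, assuming the quotas view exposes the limit and the tokens consumed today in columns such as value and usage (run the query above first to inspect the actual column names):

```sql
-- Assumption: the quotas view has columns for the configured limit and
-- the tokens consumed today; verify the real column names first.
SELECT name,
       value AS daily_budget,
       usage AS tokens_used_today,
       value - usage AS tokens_remaining
FROM account_db.information_schema.quotas
WHERE name = 'LLM_TOKEN_BUDGET';
```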
If you exceed the daily budget, invocations of AWS_BEDROCK_AI_QUERY will fail until the limit resets or you increase the budget.
Using LLM functions such as AWS_BEDROCK_AI_QUERY on large tables or with many rows can quickly exhaust your daily LLM token budget and may result in significant costs in your AWS account. Always review your expected token usage and budget before running large-scale AI queries.
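One way to limit exposure is to test a prompt on a small sample before running it over the full table. A sketch using the location from the examples above; my_table and prompt_column are hypothetical placeholders:

```sql
-- Dry run on a handful of rows first; only remove the LIMIT once the
-- prompt and per-row token cost look reasonable.
SELECT AI_QUERY(
    'us.meta.llama3-3-70b-instruct-v1:0',
    'Summarize in one sentence: ' || prompt_column,
    'my_bedrock_location') AS summary
FROM my_table
LIMIT 5;
```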