Monitor costs and usage with LLM analytics

Understand your users, models and costs with powerful analytics designed specifically for LLMs.

01
· Features

Make better, data-driven decisions

Usage monitoring

Track and analyze usage, requests, tokens, cache, and more.

Cost analytics

Understand how much users, requests and models are costing you.

LLM optimization

Easily add remote caching, rate limits, and auto-retries (coming soon).
02
· Analytics

Search and filter through your data

Search by keyword

Use our powerful search function to find keywords across all of your data tables.

Filter by column

Look up keywords with exact or fuzzy match, filtering as many columns as you need.
03
· Setup

Set up with two lines of code

curl --request POST \
   --url https://openai.flowstack.ai/v1/chat/completions \
   --header 'Authorization: Bearer OPENAI_API_KEY' \
   --header 'Flowstack-Auth: Bearer FLOWSTACK_API_KEY' \
   --header 'Content-Type: application/json' \
   --data '{
       "model": "gpt-3.5-turbo",
       "messages": [
           {
               "role": "system",
               "content": "Tell me a funny joke"
           }
       ],
       "temperature": 1,
       "max_tokens": 50
}'
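The same request can be built in Python with only the standard library. This is a minimal sketch, assuming the same Flowstack proxy URL and headers shown in the curl example above; the key values are placeholders you would replace with your own.

```python
# Sketch: route an OpenAI chat completion through Flowstack's proxy.
# Compared with a direct OpenAI call, only the URL and the extra
# Flowstack-Auth header change. Keys below are placeholders.
import json
import urllib.request

OPENAI_API_KEY = "OPENAI_API_KEY"        # placeholder: your OpenAI key
FLOWSTACK_API_KEY = "FLOWSTACK_API_KEY"  # placeholder: your Flowstack key

payload = {
    "model": "gpt-3.5-turbo",
    "messages": [
        {"role": "system", "content": "Tell me a funny joke"},
    ],
    "temperature": 1,
    "max_tokens": 50,
}

# Build the POST request with the same body and headers as the curl call.
request = urllib.request.Request(
    "https://openai.flowstack.ai/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {OPENAI_API_KEY}",
        "Flowstack-Auth": f"Bearer {FLOWSTACK_API_KEY}",
        "Content-Type": "application/json",
    },
    method="POST",
)

# With real keys, send it like any other HTTP request:
# with urllib.request.urlopen(request) as response:
#     print(json.load(response))
```

Because the endpoint is a drop-in proxy, any HTTP client or OpenAI SDK that lets you override the base URL and add a header should work the same way.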
04
· Pricing

Use Flowstack for free

Flowstack is completely free — all we ask in return is feedback to help us get better.

Free
$0/month
Monitor all your LLMs from one place
Start for free
- Unlimited API requests
- Analytics dashboard and data tables
- Remote caching and rate limits (coming soon)
- Priority support

Enterprise
Custom
Specialized or white-label solutions
Coming soon
Custom setups such as:
- SLAs and enhanced security
- Optional self-hosting on AWS, GCP or Azure
- White-label and embedded analytics
- Dedicated support
05
· Compatibility

Monitor all your LLMs from one place

Compatible with OpenAI, Anthropic, AI21, AWS Bedrock, and GCP Vertex AI — more coming soon.
