A Runtime for your LLMs
Beanbox is a place where you can safely execute code from your LLMs, hassle-free.
As simple as a POST request
We think deploying and running code remotely should be as easy as making a POST request.
So that's exactly what we built.
This 'hello world' takes about a minute, so let's get started!
1. Create an account
By clicking this button 👉 Sign in
2. Run some code
We'll create two files: code.py, the code we want to run, and main.py, which sends code.py to beanbox.
Create a file named code.py like this:
import numpy as np

def main():
    # Generate a random dataset of 10 data points
    data = np.random.rand(10)
    mean_value = np.mean(data)
    std_deviation = np.std(data)

    # Example: get some descriptive stats
    print(f"Mean: {mean_value}")
    print(f"Standard Deviation: {std_deviation}")

    # Example: normalize data to have a mean of 0 and standard deviation of 1
    normalized_data = (data - mean_value) / std_deviation
    print(f"Normalized Data: {normalized_data}")

if __name__ == "__main__":
    main()
And in the same directory, create a file called main.py that looks like this:
import requests

def main():
    # Auth and URL for beanbox
    url = "https://api.beanbox.dev/run"
    headers = {
        "Content-Type": "application/json",
        "Authorization": "Bearer <YOUR-API-TOKEN>"
    }

    # Grab the code from the file we want to run.
    # You can replace this with a call to an LLM that gives you
    # code you want to run.
    with open("./code.py") as f:
        data = {
            "code": f.read()
        }

    # Send it to beanbox
    response = requests.post(url, headers=headers, json=data)
    print(response.json())

if __name__ == "__main__":
    main()
This file reads code.py and sends it to beanbox. You can have beanbox execute Python that you've written yourself, but it really shines when executing code written by LLMs.
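To use LLM-generated code instead of a file on disk, you only need to build the same payload from the model's output. As a sketch (the fence-stripping helper below is our own assumption, not part of beanbox; LLMs often wrap code in markdown fences):

```python
def build_payload(llm_output: str) -> dict:
    """Wrap LLM-generated code for beanbox, stripping a markdown fence if present."""
    text = llm_output.strip()
    if text.startswith("```"):
        lines = text.splitlines()
        # Drop the opening fence line (e.g. ```python) and a closing fence
        if lines and lines[-1].strip() == "```":
            lines = lines[1:-1]
        else:
            lines = lines[1:]
        text = "\n".join(lines)
    return {"code": text}

# The resulting dict can be passed to requests.post(..., json=payload)
payload = build_payload("```python\nprint('hi')\n```")
```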
3. Done!
You should get back the stdout & stderr of the run as a response, something like this:
{
    'stdout': 'Normalized Data: [ 0.19756498 1.04981773 -1.27007258 ...]',
    'stderr': ''
}
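Since errors in the sandboxed run show up in stderr rather than as an exception on your side, it's worth checking the response before using stdout. A minimal sketch, assuming the 'stdout'/'stderr' fields shown above:

```python
def check_result(result: dict) -> str:
    """Return stdout from a beanbox response, or raise if the run wrote to stderr."""
    if result.get("stderr"):
        raise RuntimeError(f"remote execution failed: {result['stderr']}")
    return result["stdout"]

# Usage: stdout = check_result(response.json())
print(check_result({"stdout": "Mean: 0.5", "stderr": ""}))
```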
Frequently asked questions
Let's answer some common questions.
What languages are supported?
Currently, we only support Python 3.11.
How do I install packages?
The runtime is based on Pyodide, so a regular pip install doesn't work out of the box.
Currently we provide a set of preinstalled packages, and in the future we'll introduce a way to install more on the fly.
Currently available packages are:
- NumPy
- pandas
- SciPy
- Matplotlib
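Any code you send in the "code" field can use these packages directly. For example, this (using only NumPy from the list above) should run as-is:

```python
import numpy as np

# Five evenly spaced values between 0 and 1
values = np.linspace(0.0, 1.0, 5)

# 0.0 + 0.25 + 0.5 + 0.75 + 1.0
print(f"Sum: {values.sum()}")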
What does the pricing look like?
For now we'll keep the service free. Once we reach a more stable point, we'll introduce mainly
usage-based pricing.
What about scaling?
You should expect the service to scale "infinitely", similar to how AWS Lambda scales. At the
moment, however, we're still running on a few test servers with limited capacity and queue incoming
calls, so with a high number of parallel requests you might experience some delay.
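Until then, a simple client-side retry with backoff can smooth over queueing delays. This helper is a sketch on our side, not part of beanbox:

```python
import time

def post_with_retry(post, payload, attempts=3, base_delay=1.0):
    """Call post(payload), retrying with exponential backoff on failure.

    `post` is any callable that raises on failure -- for example a wrapper
    around requests.post that raises for non-200 responses.
    """
    for attempt in range(attempts):
        try:
            return post(payload)
        except Exception:
            if attempt == attempts - 1:
                raise  # Out of attempts: let the caller see the error
            time.sleep(base_delay * 2 ** attempt)
```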
How do I get in touch?
Twitter DMs are the way to go.