Execution Environment

When your tool runs on Rival, it executes in a managed, isolated environment. Understanding what that environment looks like helps you write handlers that behave predictably and stay within platform limits.


Isolation and statefulness

Every execution runs in its own isolated environment. There is no shared memory between executions - not between two calls to the same tool, and not between different tools in the same organization. Each invocation starts clean.

This means you cannot use in-memory variables, module-level state, or global caches to persist data between calls. If your tool needs to remember something between invocations, it must read from and write to an external store. See Storage for guidance on your options.
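To see why in-memory state fails here, consider this sketch of a module-level counter. In a long-lived process it would grow across calls, but because each Rival invocation starts in a fresh environment, it is re-initialized every time:

```python
# Module-level state: on Rival this is re-created on every invocation,
# so it cannot be used to remember anything between calls.
_call_count = 0

def cortexone_handler(event, context):
    global _call_count
    _call_count += 1
    # Because every execution starts clean, _call_count is always 1 here,
    # no matter how many times the tool has been invoked before.
    return {"statusCode": 200, "body": {"calls_seen": _call_count}}
```

If the count genuinely matters across invocations, it belongs in an external store, not in the handler's memory.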


Resource limits

Each execution has access to the following resources:

Resource    Limit
Memory      512 MB
Timeout     300 seconds
CPU         2.0 cores

If your handler exceeds the timeout, the execution is terminated and the caller receives a 408 response. Memory overruns also terminate the execution. Design your handlers to complete within these bounds - if you are doing heavy processing, consider breaking work into smaller executions or offloading expensive steps to external services.
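One way to break work into smaller executions is to process items in bounded chunks against a self-imposed deadline and return a cursor so the caller can resume. This is a sketch: the safety margin and the `cursor` field are assumptions chosen for illustration, not a platform API.

```python
import time

EXECUTION_BUDGET_SECONDS = 300  # platform timeout
SAFETY_MARGIN_SECONDS = 10      # stop early; assumed margin, tune for your workload

def cortexone_handler(event, context):
    items = event.get("items", [])
    start = event.get("cursor", 0)  # hypothetical resume point passed by the caller
    deadline = time.monotonic() + EXECUTION_BUDGET_SECONDS - SAFETY_MARGIN_SECONDS

    processed = []
    for i in range(start, len(items)):
        if time.monotonic() >= deadline:
            # Out of time: hand back a cursor so the caller can invoke us again.
            return {"statusCode": 200,
                    "body": {"processed": processed, "cursor": i, "done": False}}
        processed.append(items[i] * 2)  # stand-in for real per-item work

    return {"statusCode": 200, "body": {"processed": processed, "done": True}}
```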


Environment variables

Environment variables defined in your workspace are injected into the execution environment at runtime. Your handler can read them using the standard mechanism for its language (os.environ in Python, process.env in JavaScript).

Environment variables are set at the workspace level in Workspace Settings → API and assigned to specific tools from the tool’s settings. They are a good place to store API keys, endpoint URLs, and other configuration that should not be hardcoded into your handler.
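A handler might read that configuration like this. The variable names `API_BASE_URL` and `API_KEY` are hypothetical; substitute whatever you defined in Workspace Settings:

```python
import os

def cortexone_handler(event, context):
    # API_BASE_URL and API_KEY are illustrative workspace variable names.
    base_url = os.environ.get("API_BASE_URL", "https://api.example.com")
    api_key = os.environ["API_KEY"]  # raise early if required config is missing
    return {"statusCode": 200,
            "body": {"endpoint": f"{base_url}/v1/items", "key_set": bool(api_key)}}
```

Using `os.environ[...]` for required values makes a misconfigured tool fail immediately with a clear `KeyError`, rather than partway through a request.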


Python environment

Python tools run with access to any packages listed in your requirements.txt file. Rival installs those dependencies before your handler runs, so there is a brief startup cost on cold starts.
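A minimal requirements.txt might look like this; the packages and versions are illustrative, not required by the platform:

```
requests==2.31.0
pydantic==2.7.0
```

Pinning exact versions keeps cold-start installs reproducible.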

Your main handler file must be named cortexone_function.py and must define a function called cortexone_handler(event, context). The event argument contains the input payload; context provides execution metadata.

def cortexone_handler(event, context):
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": {"message": f"Hello, {name}!"}
    }

The Python runtime supports async execution, so your handler can perform standard I/O and make network calls, including with async libraries.
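For example, assuming the runtime awaits a coroutine handler, you could keep a blocking HTTP fetch off the event loop with asyncio.to_thread. This is a sketch using only the standard library; the `url` input field is an assumption of this example:

```python
import asyncio
import json
import urllib.request

def _fetch(url):
    # Blocking standard-library fetch, run on a worker thread below.
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.loads(resp.read())

async def cortexone_handler(event, context):
    url = event.get("url")  # hypothetical input field for this sketch
    if url is None:
        return {"statusCode": 400, "body": {"error": "missing url"}}
    data = await asyncio.to_thread(_fetch, url)
    return {"statusCode": 200, "body": data}
```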


JavaScript environment

JavaScript tools run synchronously. Async/await and Promises are not supported - your handler must complete its work without yielding to an event loop. This makes JavaScript well-suited for CPU-bound logic and in-memory transformations, but not for making HTTP requests or other I/O-bound operations.

There is no package manager in the JavaScript runtime. Your handler must be self-contained, using only JavaScript built-ins.

function handler(event) {
  const name = event.name || "world";
  return {
    statusCode: 200,
    body: { message: `Hello, ${name}!` }
  };
}

Lua environment

Lua tools also run synchronously. The standard Lua library is available, but there is no package manager and no support for async I/O. Lua is fastest for lightweight, deterministic computation.

function handler(event)
  local name = event.name or "world"
  return {
    statusCode = 200,
    body = { message = "Hello, " .. name .. "!" }
  }
end

What the environment does not have

The execution environment is intentionally minimal. There is no persistent local filesystem - files written during an execution do not survive after it completes. There is no network file system mounted by default. For persistent file access, use Digital Assets, which provides stable file paths that can be read within your handler.
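Scratch files are still fine within a single execution; they just vanish when it ends. A sketch, assuming the runtime exposes a writable temporary directory as most sandboxes do:

```python
import os
import tempfile

def cortexone_handler(event, context):
    # Write and read a scratch file within one execution.
    with tempfile.NamedTemporaryFile(mode="w+", suffix=".txt", delete=False) as f:
        f.write(event.get("text", ""))
        path = f.name
    with open(path) as f:
        contents = f.read()
    os.remove(path)  # tidy up; the file would not survive the execution anyway
    return {"statusCode": 200, "body": {"echo": contents}}
```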

Outbound network access is not restricted: your handler can make HTTP requests to external APIs, provided the runtime supports it. In practice that means Python; the JavaScript and Lua runtimes are synchronous, so network calls are not practical there.