How Aisle Works

Aisle is an orchestration layer for AI across your team. It connects multiple AI models, your data, your tools, and your people in one place - without locking you into any single model provider.

The core idea: instead of each person crafting their own prompts and picking their own models, your team builds shared, reusable AI components that everyone can run consistently.

The building blocks

Models

Aisle connects to models from Anthropic, OpenAI, Google, xAI, and OpenRouter. You choose the model; Aisle handles the connection. Switch models at any time - your prompts, workflows, and settings stay exactly as they are.

Prompts

A prompt is a reusable, versioned AI instruction. You write it once, configure the model and settings, and publish it. Anyone in your organization can run it from the chat launcher without knowing how it was built.

Prompts are the unit of reuse. The same prompt can be run in chat, called from a workflow, triggered via API, or used as a tool in a Project.
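As a sketch of the API path, triggering a published prompt could look like the snippet below. The base URL, endpoint shape, payload fields, and auth header are illustrative assumptions, not Aisle's documented API.

```python
import json
import urllib.request

API_BASE = "https://api.example.com/v1"  # placeholder base URL, not Aisle's real endpoint


def build_prompt_request(prompt_id: str, inputs: dict, api_key: str) -> urllib.request.Request:
    """Build the POST request that would run a published prompt.

    Endpoint path and payload shape are hypothetical, for illustration only.
    """
    return urllib.request.Request(
        f"{API_BASE}/prompts/{prompt_id}/run",
        data=json.dumps({"inputs": inputs}).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Sending it with urllib.request.urlopen(...) would return the prompt's output;
# that call is omitted here because the endpoint is a stand-in.
```

The point is the shape: the caller passes inputs and an identifier, and the same versioned prompt runs regardless of whether it was invoked from chat, a workflow, or code.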

Chat

Chat is the main interface most users work from. You select a model or a library item (a prompt or workflow your team has built), type a message, and get a response. Chat is conversational - follow-ups, revisions, and direction changes are all part of the same thread.

Chat also supports file attachments, tool use (web search, MCP connectors), and collaborative sharing - you can invite teammates to a conversation and they can contribute messages directly.

Workflows

A workflow chains prompts, integrations, and logic into a multi-step automation. A workflow can read from an external service, process the data with a prompt, apply conditions, loop over items, and write results back - all without code.

Workflows are triggered by users in chat, via API, on a schedule, or by external webhooks.

Memories

Memories are documents your workflows can create, read, update, and search. They persist between runs, letting workflows build up context over time.

With embeddings enabled, a memory folder becomes a semantic knowledge base. Workflows can search it by meaning (not just keywords) and use the results as context for a prompt. This is how you build RAG workflows - retrieve relevant content, inject it into a prompt, get grounded answers.
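The retrieve-then-inject pattern can be sketched in plain Python. The embedding function here is a toy bag-of-words stand-in (a real setup would call an embedding model, and Aisle handles this internally); the structure is the point: embed the query, rank stored memories by similarity, and prepend the top hits to the prompt.

```python
import math
import re
from collections import Counter


def embed(text: str) -> Counter:
    """Toy embedding: word counts. Stand-in for a real embedding model."""
    return Counter(re.findall(r"\w+", text.lower()))


def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0


def retrieve(query: str, memories: list[str], k: int = 2) -> list[str]:
    """Return the k memories most similar to the query."""
    q = embed(query)
    return sorted(memories, key=lambda m: cosine(q, embed(m)), reverse=True)[:k]


def build_grounded_prompt(query: str, memories: list[str]) -> str:
    """Inject retrieved memories as context ahead of the user's question."""
    context = "\n".join(f"- {m}" for m in retrieve(query, memories))
    return f"Use this context:\n{context}\n\nQuestion: {query}"
```

In an actual RAG workflow the search runs against the embedded memory folder rather than an in-process list, but the flow is the same: retrieve, inject, then run the prompt.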

Memories can also be exposed as an MCP server, making a folder accessible to external agents and tools (like Claude Code or Cursor). This is a separate, opt-in capability - see Memories MCP Server.

Playgrounds

Playgrounds are a testing workspace where you run multiple prompt configurations side-by-side with identical inputs. Load the same prompt at two different versions, or run it against Claude and GPT simultaneously. The outputs appear in columns so you can compare them directly.

When you save a Playground, the full run state is preserved - column configurations, inputs, and the actual outputs. Share the link and your teammates see exactly what you saw. This makes Playgrounds useful for documenting decisions: link to a saved Playground from a PR description or a Notion doc to show the evidence behind a model or prompt choice.

Scripts

Scripts are Python or Ruby functions. Use them in workflows when you need deterministic code logic - data transformation, API calls, custom formatting - rather than an AI instruction.
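A script used as a workflow node might look like the function below: a deterministic transform that cleans records before they reach a prompt. The function name and signature are illustrative, not a format Aisle requires.

```python
def normalize_tickets(raw: list[dict]) -> list[dict]:
    """Deterministic transform: trim subjects, lowercase emails, drop empty rows.

    Illustrative example of a workflow script node; the signature is assumed.
    """
    cleaned = []
    for row in raw:
        subject = (row.get("subject") or "").strip()
        email = (row.get("email") or "").strip().lower()
        if not subject:
            continue  # skip rows with no usable subject
        cleaned.append({"subject": subject, "email": email})
    return cleaned
```

This is the kind of logic that belongs in a script rather than a prompt: there is one correct output, so code is cheaper, faster, and more reliable than a model call.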

Connectors

Connectors link Aisle to external services. A connector can appear in three places:

  • As a workflow node - used inside a workflow to read or write data
  • As a chat tool (MCP) - available to the model during a conversation
  • As a webhook trigger - fires a workflow when an event happens in the external service

Projects

A Project is a workspace that bundles prompts, workflows, scripts, connectors, and memory folders together with a set of instructions and a default model. When you start a chat inside a Project, everything assigned to it is already active.

Projects are the highest-level concept in Aisle - they let teams organise their AI operations around a purpose (a team, a client, a product) rather than managing individual tools.

How they connect

Models → run inside Chat, Prompts, and Workflows
Prompts → reusable instructions, run in Chat or as Workflow nodes
Chat → conversational interface; runs Models, Prompts, and Workflows
Workflows → chains of Prompts, Scripts, and Connectors
Memories → persistent storage, searchable by Workflows and Projects
Playgrounds → side-by-side testing of Prompts across models and versions
Scripts → code functions, usable as Workflow nodes
Connectors → external service links, usable in Workflows and Chat
Projects → workspaces that bundle all of the above

A typical team setup: admins connect the services the company uses (Connectors), build the core AI instructions (Prompts), assemble automations (Workflows), store institutional knowledge (Memories), and surface everything in focused workspaces (Projects). Team members run all of it from chat without needing to understand how it works internally.

What Aisle is not

Aisle is not a code assistant or a general-purpose chatbot. It is an operations layer - the infrastructure your team uses to build, share, govern, and run AI consistently.