Quickstart
Get Curiosity Workspace running locally in minutes using Docker. This page gets you to a working UI so you can poke around. For an end-to-end developer walkthrough (schema → connector → search → AI tool → permission-aware endpoint), see Build your first enterprise AI app.
Local development only
The commands on this page bind the workspace to your machine with default credentials and no TLS. Do not use this configuration on a shared network or in production. For production deployment, follow the Installation guides for your target platform and the Production deployment checklist.
Prerequisites
- Docker installed and running. Get Docker.
- 8 GB of RAM available to the container (16 GB recommended once you load real data).
- A free local port (default: 8080).
If you plan to enable AI features in step 4, you also need an API key for an LLM provider (OpenAI, Azure OpenAI, Anthropic, or a local model server). See LLM Configuration.
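You can sanity-check the prerequisites from a terminal before starting. This is an optional sketch; it assumes a Unix-like shell with lsof installed and the default port 8080:

```shell
# Confirm the Docker daemon is reachable.
docker info > /dev/null 2>&1 && echo "Docker is running"

# Check that nothing else is listening on the default port;
# no output means the port is free.
lsof -nP -iTCP:8080 -sTCP:LISTEN 2>/dev/null
```

If the port is taken, pick another one and adjust the `-p` flag in step 1 accordingly.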
Step 1: Start Curiosity Workspace
The Curiosity Workspace container packages the graph, search, and AI runtime in a single image.
mkdir -p ~/curiosity/storage
docker run --name curiosity \
-p 127.0.0.1:8080:8080 \
-v ~/curiosity/storage/:/data/ \
-e MSK_GRAPH_STORAGE=/data/curiosity \
-e MSK_ADMIN_PASSWORD="$(openssl rand -base64 24)" \
curiosityai/curiosity
We deliberately bind to 127.0.0.1 so the container is only reachable from your machine. MSK_ADMIN_PASSWORD replaces the default admin/admin credentials with a generated value; because it is generated inline, recover it from your shell history (or print it from the running container with docker exec curiosity printenv MSK_ADMIN_PASSWORD) when you log in. First boot takes a few moments while the database and indexes initialize.
If you omit MSK_ADMIN_PASSWORD, the workspace falls back to admin / admin. This is convenient when you're poking around alone on 127.0.0.1, but rotate the password from Settings → Accounts → Users before the workspace is reachable from anywhere else. See the full environment-variable list in the Configuration reference.
To keep the container running in the background, add -d. To stop it: docker stop curiosity.
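A password generated inline with $(openssl …) is easy to lose. A small variation keeps it in a shell variable so you can read it back when you sign in:

```shell
# Generate the admin password once and keep it in the current shell,
# so it can be printed again before you sign in.
MSK_ADMIN_PASSWORD="$(openssl rand -base64 24)"
echo "Admin password: $MSK_ADMIN_PASSWORD"
```

Then pass -e MSK_ADMIN_PASSWORD="$MSK_ADMIN_PASSWORD" in the docker run command above in place of the inline substitution. The value survives only as long as the shell session, so store it in a password manager before closing the terminal.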
Step 2: Sign in and complete first-boot setup
Open http://localhost:8080 in your browser.
Sign in with username admin and the password you set via MSK_ADMIN_PASSWORD (or admin if you skipped the variable). Before doing anything else:
- Open Settings → Accounts → Users and either rotate the admin password or invite a new admin account and delete admin.
- If you'll expose this workspace to anyone other than yourself, complete the security baseline checklist before continuing.
Then follow the setup wizard to name your workspace.
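If the browser shows nothing, confirm the workspace is actually listening before debugging further. A quick check from the terminal, assuming the default port binding from step 1:

```shell
# Any HTTP status code (even a redirect to the login page) means the
# server is up; "connection refused" means it is still initializing.
curl -sS -o /dev/null -w "%{http_code}\n" http://127.0.0.1:8080/

# If it never comes up, the container logs usually say why.
docker logs --tail 50 curiosity
```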
Step 3: Connect a data source
A workspace is only useful when it has data in it. Pick the simplest source you have access to:
- Navigate to Settings → Integrations.
- Choose a built-in connector (for example Filesystem for local files, or Web for a public URL).
- Provide the connection details and start the initial sync.
- Watch ingestion progress under Settings → Tasks and Settings → Monitoring.
If you'd rather see what production-quality ingestion looks like, the HackerNews example and the Technical Support tutorial walk through complete C# connectors.
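One detail that trips people up with the Filesystem connector: the workspace runs inside the container, so it can only see paths that are mounted into it. A sketch, assuming you want to index ~/Documents (the /input mount point is an arbitrary choice for this example, not a fixed convention):

```shell
# Recreate the container with an extra read-only mount, then point the
# Filesystem connector at /input from the workspace UI.
docker rm -f curiosity
docker run -d --name curiosity \
  -p 127.0.0.1:8080:8080 \
  -v ~/curiosity/storage/:/data/ \
  -v ~/Documents/:/input/:ro \
  -e MSK_GRAPH_STORAGE=/data/curiosity \
  curiosityai/curiosity
```

Because all workspace state lives under the ~/curiosity/storage mount, removing and recreating the container this way does not lose your data.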
Step 4: Configure embeddings and an LLM (optional)
This unlocks vector/hybrid search and the chat assistant.
- Navigate to Settings → AI Settings.
- Add a provider for embeddings (used for vector and hybrid search). Common choices: OpenAI text-embedding-3-small, an Azure OpenAI deployment, or a local embedding server. See Embeddings.
- Add a provider for chat / generation (used by the chat assistant and AI tools). See LLM Configuration.
- Trigger a re-index so existing content gets embedded: Settings → Maintenance → Rebuild indexes.
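Provider setup failures usually come down to a bad key or an unreachable endpoint, so it can save time to verify credentials outside the workspace first. For example, with OpenAI (substitute your provider's equivalent endpoint; OPENAI_API_KEY is assumed to hold your key):

```shell
# Lists available models on success; a 401 response means the key is bad.
curl -sS https://api.openai.com/v1/models \
  -H "Authorization: Bearer $OPENAI_API_KEY" | head -c 300
```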
Step 5: Search and ask
- Open the Search view to test keyword + hybrid retrieval on the data you ingested.
- Open the Chat view to ask grounded questions with citations.
- Compare both with Chat vs Search.
Step 6: Validate persistence
Stop and restart the container:
docker stop curiosity && docker start curiosity
Then confirm:
- Your workspace name, users, and configuration survived the restart.
- Your ingested data is still searchable.
If anything is missing after restart, your volume mount is not persisting — see Common installation pitfalls.
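The restart check can also be scripted. A sketch, assuming the container name and volume layout from step 1; it counts entries in the data directory before and after the bounce:

```shell
# Snapshot the data directory, restart the container, and compare.
before=$(docker exec curiosity sh -c 'ls -R /data | wc -l')
docker stop curiosity && docker start curiosity
after=$(docker exec curiosity sh -c 'ls -R /data | wc -l')
echo "entries before: $before, after: $after"
```

If the counts differ wildly, or the second docker exec finds an empty /data, the volume mount is the first thing to inspect.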
Where to go next
- Build something real with the end-to-end developer journey: Build your first enterprise AI app.
- Understand the platform: Architecture, Data Flow, Graph Model.
- Move toward production: Installation guides per platform, Deployment checklist, Backup & restore.
- Integrate via API: API Overview, Custom Endpoints.