Comprehensive Installation Guide
This guide covers all installation methods for Lobster AI, from quick setup to advanced development configurations.
⚠️ Breaking Changes in v0.4.0: Lobster now requires explicit provider configuration via lobster init. Auto-detection from .env files has been removed. Existing users: run lobster init to create the new provider_config.json file. Your API keys in .env will continue to work.
Table of Contents
- Prerequisites
- Installation Methods
- Platform-Specific Instructions
- Verification
- Uninstalling Lobster AI
- Development Installation
- Optional Dependencies
- Docker Deployment
- Troubleshooting
Prerequisites
System Requirements
Minimum Requirements:
- Python: 3.12-3.14 (the package requires >=3.12)
- Memory: 4GB RAM (8GB+ recommended for large datasets)
- Storage: 2GB free space (more for data analysis)
- Network: Internet connection for API access and data downloads
Recommended Setup:
- Python: 3.12-3.14
- Memory: 16GB+ RAM
- Storage: 10GB+ free space
- CPU: Multi-core processor for parallel analysis
Package Manager Recommendations
Lobster AI automatically detects and uses the best available package manager:
- uv (fastest, recommended): Install uv
- pip3 (macOS default)
- pip (fallback)
We recommend installing uv for significantly faster package installation and dependency resolution.
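The detection order above can be sketched as a small shell check (illustrative only; `detect_installer` is a hypothetical helper, and the real detection lives inside Lobster):

```shell
# Pick an installer following the documented preference order:
# uv first, then pip3, then pip.
detect_installer() {
  if command -v uv >/dev/null 2>&1; then
    echo "uv pip"
  elif command -v pip3 >/dev/null 2>&1; then
    echo "pip3"
  else
    echo "pip"
  fi
}
echo "Would install with: $(detect_installer)"
```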
Required API Keys
Easy Setup: Lobster requires explicit provider configuration (v0.4.0+). Run the interactive setup wizard to configure:
lobster init
The wizard creates two files:
- provider_config.json - Provider/model selection (safe to commit to git)
- .env - API keys and secrets (never commit to git)
Choose ONE of the following LLM providers:
- Anthropic API Key (Recommended for most users)
⚠️ Important: Rate Limits - Anthropic applies conservative rate limits to new accounts. For production use or heavy workloads, we recommend AWS Bedrock. If you encounter rate limit errors, see Troubleshooting Guide.
- Visit Anthropic Console
- Create account and generate API key
- The wizard will prompt you to enter:
ANTHROPIC_API_KEY=sk-ant-...
- Recommended for: Quick testing, development with small datasets
- Not recommended for: Production deployments, large-scale analysis
- AWS Bedrock Access (Production/Enterprise)
✅ Best for production - AWS Bedrock provides enterprise-grade rate limits and reliability. Recommended for heavy workloads and production deployments.
- AWS account with Bedrock access
- Create IAM user with Bedrock permissions
- The wizard will prompt you to enter:
AWS_BEDROCK_ACCESS_KEY=...
AWS_BEDROCK_SECRET_ACCESS_KEY=...
- Recommended for: Production deployments, large-scale analysis, enterprise use
- Benefits: Higher rate limits, better reliability, enterprise SLA
- Ollama (Local, Zero Cost)
🏠 Privacy-first - Run models locally on your hardware. No API keys, no internet required after model download.
- Install Ollama: https://ollama.com/
- Pull a model:
ollama pull llama3:70b-instruct
- The wizard will auto-detect Ollama and configure it
- Recommended for: Privacy-sensitive data, offline work, unlimited usage
- Benefits: Zero cost, 100% local, no rate limits, offline capable
- NCBI API Key (Optional)
- Visit NCBI E-utilities
- Enhances literature search capabilities
- The wizard offers to add this optionally
Advanced Users: You can skip the wizard by manually creating both config files (see Manual Configuration section below).
Pre-Installation Check
Before installing, verify your Python version:
python3 --version  # Should be 3.12+
Platform-Specific Recommendations:
- macOS/Linux: PyPI installation recommended
- Windows: PyPI or Docker Desktop
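For scripts and CI, a guard that fails loudly when Python is too old can be useful (a sketch assuming `python3` is on PATH; `check_python` is a hypothetical helper, not part of Lobster):

```shell
# Exit non-zero when the interpreter is older than 3.12, so install
# scripts or CI jobs can abort early instead of failing mid-install.
check_python() {
  python3 -c 'import sys; raise SystemExit(0 if sys.version_info >= (3, 12) else 1)'
}
if check_python; then
  echo "Python version OK"
else
  echo "Python 3.12+ required" >&2
fi
```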
Installation Methods
Lobster AI can be installed in two ways:
- Global Installation - Command available system-wide
- Local Installation - Command available in virtual environment only
Comparison: Global vs Local
| Aspect | Global Installation | Local Installation |
|---|---|---|
| Access | From anywhere in terminal | Only in activated venv |
| Command | uv tool install lobster-ai | pip install lobster-ai |
| Isolation | Separate per tool | Separate per project |
| Best For | CLI usage, quick analysis | Development, project-specific |
| Upgrade | uv tool upgrade lobster-ai | pip install -U lobster-ai |
| Uninstall | uv tool uninstall lobster-ai | pip uninstall lobster-ai |
Choose based on your workflow:
- CLI power user? → Global installation (recommended)
- Multiple projects with different versions? → Local installation
- Developer contributing to Lobster? → Local from source
Method 1: One-Line Install (Recommended)
The easiest installation method — handles everything automatically:
macOS/Linux:
curl -fsSL https://install.lobsterbio.com | bash
Windows (PowerShell):
irm https://install.lobsterbio.com/windows | iex
What this does:
- Installs uv (fast Python package manager) if needed
- Installs Python 3.12 if needed
- Prompts you to choose LLM provider and agent bundles
- Installs Lobster AI with uv tool install
- Runs lobster init to configure API keys
- Ready to use: lobster chat
After installation:
# Start chatting
lobster chat
# Check version
lobster --version
Method 2: Development Install (From Source)
For contributors and developers:
# Clone the repository
git clone https://github.com/the-omics-os/lobster.git
cd lobster
# Install with development dependencies
make dev-install
Additional features:
- Testing framework (pytest, coverage)
- Code quality tools (black, isort, pylint, mypy)
- Pre-commit hooks
- Documentation tools
Method 3: Manual Installation
For full control over the installation process:
# Create virtual environment
python3 -m venv .venv
source .venv/bin/activate # On Windows: .venv\Scripts\activate
# Install uv (recommended for faster installation)
# https://docs.astral.sh/uv/getting-started/installation/
# Install Lobster AI with uv (recommended)
uv pip install lobster-ai
# Alternative: pip install lobster-ai
Method 4: Global Installation
Install the lobster command globally (Unix/macOS):
# First, install locally
make install
# Then install globally
make install-global
This creates a symlink in /usr/local/bin/lobster, allowing you to run lobster from anywhere.
Method 5: PyPI Installation (Recommended)
The lobster-ai package is available on PyPI with two installation modes:
5A. Global Installation with uv tool (Recommended for CLI Use)
Best for: System-wide access, quick analysis, CLI usage
# Install uv if not already installed
# https://docs.astral.sh/uv/getting-started/installation/
# Install Lobster globally with all free agents + your provider
uv tool install "lobster-ai[full,anthropic]"
# Or with Ollama (local, no API key needed)
uv tool install "lobster-ai[full,ollama]"
# Or minimal install (add agents later)
uv tool install lobster-ai
# Configure API keys
lobster init
# Start using
lobster chat
Available extras:
- full — All free agents + optional capabilities (includes vector-search)
- anthropic — Anthropic Claude provider
- bedrock — AWS Bedrock provider
- gemini — Google Gemini provider
- openai — OpenAI GPT-4o and reasoning models provider
- azure — Azure AI Foundry provider
- ollama — Local Ollama provider
- vector-search — Semantic ontology matching (ChromaDB + SapBERT)
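Extras compose, so a provider can be paired with optional capabilities in one spec. A minimal sketch (quote the spec so the shell does not expand the brackets; the actual install line is commented out here so the snippet is safe to paste piecemeal):

```shell
# Combine a provider extra with an optional capability in a single
# requirement spec, then pass it quoted to uv.
SPEC='lobster-ai[anthropic,vector-search]'
echo "uv tool install \"$SPEC\""
# uv tool install "$SPEC"
```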
Benefits:
- ✅ Accessible from any directory
- ✅ Clean uninstall (one command)
- ✅ Isolated environment per tool
- ✅ Automatic PATH management
- ✅ No virtual environment activation needed
Upgrading:
# Upgrade to latest version
uv tool upgrade lobster-ai
# Add more agents after install
lobster init --force
# This generates the correct uv tool install command with --with flags
5B. Local Installation with Virtual Environment
Best for: Project-specific installation, multiple versions, development
# Create virtual environment
python3 -m venv .venv
source .venv/bin/activate # Windows: .venv\Scripts\activate
# Install with uv (faster)
uv pip install lobster-ai
# Or use pip
pip install lobster-ai
# Verify installation (only works when venv activated)
lobster --version
# Configure
lobster init
# Start using
lobster chat
Benefits:
- ✅ Project-specific versions
- ✅ Doesn't affect system Python
- ✅ Easy to remove (delete directory)
- ✅ Standard Python workflow
To use later:
# Always activate virtual environment first
source .venv/bin/activate
lobster chat
5C. Alternative: pipx (Similar to uv tool)
For users who already use pipx:
# Install pipx if not already installed
# https://pipx.pypa.io/stable/installation/
# Install Lobster globally with pipx
pipx install lobster-ai
# Uninstall
pipx uninstall lobster-ai
Note: pipx and uv tool are similar; choose whichever you prefer.
Configuration (All Methods):
After installation, run the configuration wizard to set up your API keys:
# Launch interactive configuration wizard
lobster init
# The wizard will guide you through:
# 1. Choose LLM provider (Anthropic, AWS Bedrock, or Ollama)
# 2. Enter your API keys securely (input is masked)
# 3. Optionally add NCBI API key for enhanced literature search
# 4. Configuration saved to TWO files:
# - provider_config.json (provider/model selection - safe to commit)
# - .env (API keys/secrets - never commit)
Additional configuration commands:
# Test API connectivity
lobster config test
# View current configuration (secrets masked)
lobster config show
# Reconfigure (creates backup of existing .env)
lobster init --force
# Non-interactive mode (for CI/CD)
lobster init --non-interactive --anthropic-key=sk-ant-xxx
The wizard creates two configuration files:
- .lobster_workspace/provider_config.json - Provider/model selection
- .env - API keys and secrets
Security Note:
- ✅ provider_config.json can be safely committed to git (no secrets)
- ❌ .env must NEVER be committed (contains API keys)
No manual file editing required!
Manual Configuration (Advanced)
If you prefer to skip the wizard, create both configuration files manually:
File 1: .lobster_workspace/provider_config.json (provider selection - safe to commit)
{
"global_provider": "anthropic",
"anthropic_model": "claude-sonnet-4-20250514",
"profile": "production"
}
Available providers: anthropic, bedrock, ollama, gemini, openai, azure
Available profiles: development, production, ultra, godmode
File 2: .env (API keys - never commit, add to .gitignore)
# Anthropic API
ANTHROPIC_API_KEY=sk-ant-api03-your-key-here
# AWS Bedrock
AWS_BEDROCK_ACCESS_KEY=your-access-key
AWS_BEDROCK_SECRET_ACCESS_KEY=your-secret-key
# Google Gemini
GOOGLE_API_KEY=your-google-api-key
# Ollama (local)
OLLAMA_BASE_URL=http://localhost:11434
OLLAMA_DEFAULT_MODEL=llama3:70b-instruct
# Optional: NCBI
NCBI_API_KEY=your-ncbi-key
Security Best Practices:
# Add .env to .gitignore
echo ".env" >> .gitignore
# Verify .env is not tracked
git check-ignore .env # Should output: .env
# provider_config.json CAN be committed (no secrets)
git add .lobster_workspace/provider_config.json
Note: For development or contributing to Lobster AI, use Method 2 (Development Install) to install from source.
Platform-Specific Instructions
macOS
One-Line Install (Recommended):
curl -fsSL https://install.lobsterbio.com | bash
This handles everything: installs uv, Python 3.12, and Lobster AI.
Manual Setup:
# Install uv
curl -LsSf https://astral.sh/uv/install.sh | sh
# Install Lobster AI globally
uv tool install "lobster-ai[full,anthropic]"
# Configure and start
lobster init
lobster chat
Using Homebrew:
# Install uv via Homebrew
brew install uv
# Install Lobster AI
uv tool install "lobster-ai[full,anthropic]"
lobster init
Installation Choice:
- Global (recommended): Use uv tool install lobster-ai (accessible anywhere)
- Local: Use the virtual environment method (project-specific)
See Installation Methods for detailed comparison.
Linux (Ubuntu/Debian)
One-Line Install (Recommended):
curl -fsSL https://install.lobsterbio.com | bash
This handles system dependencies, uv, Python 3.12, and Lobster AI.
Manual Setup:
System dependencies are required for compilation:
# 1. Install system dependencies
sudo apt update
sudo apt install -y \
build-essential \
python3.12-dev \
python3.12-venv \
pkg-config \
libhdf5-dev \
libxml2-dev \
libxslt1-dev \
libffi-dev \
libssl-dev \
libblas-dev \
liblapack-dev
# 2. Install uv
curl -LsSf https://astral.sh/uv/install.sh | sh
# 3. Install Lobster AI
uv tool install "lobster-ai[full,anthropic]"
# 4. Configure and start
lobster init
lobster chat
Why These Packages Are Required:
- build-essential: C/C++ compilers (gcc, g++, make)
- python3.12-dev: Python header files for building extensions
- libhdf5-dev: HDF5 file format support (required for AnnData)
- libblas-dev, liblapack-dev: Linear algebra libraries (required for NumPy/SciPy)
- libxml2-dev, libxslt1-dev: XML parsing (required for web scraping)
- libffi-dev, libssl-dev: Cryptography and SSL support
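To confirm these build dependencies are present before installing, a quick check works on Debian/Ubuntu (`dpkg -s` exits non-zero for missing packages; `check_build_deps` is a hypothetical helper for illustration):

```shell
# Report which of the required -dev packages are installed.
# Prints "missing: <pkg>" for anything apt still needs to install.
check_build_deps() {
  for pkg in build-essential python3.12-dev libhdf5-dev libblas-dev liblapack-dev; do
    if dpkg -s "$pkg" >/dev/null 2>&1; then
      echo "ok: $pkg"
    else
      echo "missing: $pkg"
    fi
  done
}
check_build_deps
```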
Ubuntu Version Notes:
- Ubuntu 22.04 LTS: Requires the deadsnakes PPA for Python 3.12:
sudo add-apt-repository ppa:deadsnakes/ppa
sudo apt update
sudo apt install python3.12 python3.12-venv python3.12-dev
- Ubuntu 24.04 LTS: Python 3.12+ available in the default repositories
CentOS/RHEL/Fedora:
# Install Python 3.12+
sudo dnf install python3.12 python3.12-devel
# Install development tools and libraries
sudo dnf groupinstall "Development Tools"
sudo dnf install hdf5-devel libxml2-devel libxslt-devel \
openssl-devel libffi-devel blas-devel lapack-devel
# Install Lobster AI
pip install lobster-ai
lobster init
Installation Choice:
- Global: Use uv tool install lobster-ai (accessible anywhere)
- Local: Use the virtual environment method (project-specific)
See Installation Methods for detailed comparison.
Windows
One-Line Install (Recommended):
Open PowerShell and run:
irm https://install.lobsterbio.com/windows | iex
This handles uv, Python 3.12, and Lobster AI installation.
Manual Installation:
# 1. Install uv
powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
# 2. Install Lobster AI
uv tool install "lobster-ai[full,anthropic]"
# 3. Update PATH (may require terminal restart)
uv tool update-shell
# 4. Configure and start
lobster init
lobster chat
Option: Docker Desktop
Docker provides an alternative experience on Windows:
# 1. Install Docker Desktop for Windows
# Download from: https://www.docker.com/products/docker-desktop/
# 2. Run Lobster in Docker
docker run -it --rm -e ANTHROPIC_API_KEY=your-key lobster-ai chat
Option: Windows Subsystem for Linux (WSL)
WSL 2 provides a full Linux environment:
# 1. Install WSL 2
wsl --install
# 2. Open Ubuntu terminal and use Linux installer
curl -fsSL https://install.lobsterbio.com | bash
Common Windows Issues:
- Compiler errors: Install Visual Studio Build Tools or use Docker
- Permission denied: Run PowerShell as Administrator or use Docker
- Python not found: Reinstall Python with "Add to PATH" checked
- Long path errors: Enable long path support in Windows Registry or use Docker
Installation Choice:
- Global: Use uv tool install lobster-ai (accessible anywhere)
- Local: Use the virtual environment method (project-specific)
See Installation Methods for detailed comparison.
Python Version Considerations
Python 3.12+ Requirements:
- pyproject.toml specifies: >=3.12
- Makefile enforces: >=3.12
- Recommendation: Use Python 3.12+ for best performance
Installing Specific Python Version:
# macOS with pyenv
brew install pyenv
pyenv install 3.12.0
pyenv local 3.12.0
# Ubuntu with deadsnakes PPA
sudo add-apt-repository ppa:deadsnakes/ppa
sudo apt install python3.12 python3.12-venv
Verification
Test Installation
After installation, verify everything works:
# Activate environment
source .venv/bin/activate # Linux/macOS
# .venv\Scripts\activate # Windows
# Test CLI
lobster --help
# Test imports
python -c "import lobster; print('✅ Lobster imported successfully')"
# Run verification script
python verify_installation.py
Configuration Wizard
After installation, run the configuration wizard to set up your API keys:
# Launch interactive configuration wizard
lobster init
Expected wizard output:
╭────────────────────────────────────────────────────────────╮
│ 🦞 Welcome to Lobster AI! │
│ │
│ Let's set up your LLM provider. │
╰────────────────────────────────────────────────────────────╯
Select your LLM provider:
1 - Anthropic API - Best quality, quick setup
2 - AWS Bedrock - Production, enterprise use
3 - Ollama (Local) - Privacy, zero cost, offline
Choose provider [1]: 1
🔑 Anthropic API Configuration
Get your API key from: https://console.anthropic.com/
Enter your API key: ********************************
📚 NCBI API Key (Optional)
Enhances literature search capabilities.
Add NCBI API key? [y/N]: n
╭────────────────────────────────────────────────────────────╮
│ ✅ Configuration saved! │
│ │
│ Files created: │
│ • .env (API keys - never commit to git) │
│ • .lobster_workspace/provider_config.json (versioned) │
│ │
│ Next step: Run lobster chat │
╰────────────────────────────────────────────────────────────╯
Configuration management commands:
# Test API connectivity
lobster config test
# View current configuration (secrets masked)
lobster config show
# Reconfigure (creates timestamped backup of existing .env)
lobster init --force
Check System Status
# After configuration, check status in chat
lobster chat
# In the chat interface, type:
/statusExpected output:
✅ System Status: Healthy
✅ Environment: Virtual environment active
✅ Dependencies: All packages installed
✅ Configuration: .env file present
✅ API Keys: Configured
Workspace Location
Lobster stores downloaded datasets, analysis results, plots, and exports in a workspace directory. By default, this is .lobster_workspace/ in your current working directory.
Default behavior:
# Running lobster creates workspace in current directory
cd /path/to/my/project
lobster chat
# Creates: /path/to/my/project/.lobster_workspace/
Override options:
- CLI flag (highest priority):
lobster chat --workspace /custom/path/to/workspace
lobster query "analyze my data" --workspace ~/my_lobster_workspace
- Environment variable:
# Add to ~/.bashrc or ~/.zshrc for persistence
export LOBSTER_WORKSPACE=/path/to/shared/workspace
# Or set per-session
LOBSTER_WORKSPACE=/tmp/test_workspace lobster chat
Resolution order: CLI --workspace flag > LOBSTER_WORKSPACE env var > current directory default
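The precedence can be illustrated with a toy function (a sketch of the documented resolution order, not Lobster's actual code; paths are placeholders):

```shell
# Mimic the documented precedence: --workspace flag first, then the
# LOBSTER_WORKSPACE variable, then ./.lobster_workspace as the default.
resolve_workspace() {
  flag="$1"
  if [ -n "$flag" ]; then
    echo "$flag"
  elif [ -n "$LOBSTER_WORKSPACE" ]; then
    echo "$LOBSTER_WORKSPACE"
  else
    echo "$PWD/.lobster_workspace"
  fi
}
resolve_workspace /custom/ws                    # flag wins
LOBSTER_WORKSPACE=/shared/ws resolve_workspace  # env var is next
```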
Workspace structure:
.lobster_workspace/
├── data/ # Downloaded and processed datasets (.h5ad files)
├── exports/ # Exported notebooks and reports
├── plots/ # Generated visualizations
├── cache/ # Temporary cache files
└── .session.json  # Session state
Best practices:
- Use the same workspace directory across related analyses for data sharing
- Set LOBSTER_WORKSPACE for team environments or CI/CD pipelines
- Keep workspaces in version-controlled project directories for reproducibility
Verify API Connectivity
# Test API connectivity and validate configuration
lobster config test
This command will:
- Check for .env file existence
- Test LLM provider (Claude API or AWS Bedrock) connectivity
- Test NCBI API if configured
- Display detailed test results with ✅/❌ status for each service
🤖 Using Claude Code, Codex, or Gemini CLI? Install Lobster skills — your AI assistant will instantly know how to use Lobster for bioinformatics analysis.
curl -fsSL https://skills.lobsterbio.com | bash
No documentation reading required. Just ask: "Analyze my RNA-seq data" and your AI handles the rest.
Uninstalling Lobster AI
Remove Package
The uninstall process depends on how you installed Lobster:
Global Installation (uv tool)
# One command to remove everything
uv tool uninstall lobster-ai
# Verify removal
uv tool list # Should not show lobster-ai
which lobster  # Should output nothing or "not found"
✅ Clean: Removes the tool and its isolated environment completely.
Global Installation (pipx)
# One command to remove everything
pipx uninstall lobster-ai
# Verify removal
pipx list | grep lobster # Should output nothing
which lobster  # Should output nothing
Local Installation (Virtual Environment)
# Activate virtual environment first
source .venv/bin/activate # Windows: .venv\Scripts\activate
# Uninstall package
pip uninstall lobster-ai
# Deactivate virtual environment
deactivate
# Optional: Remove entire virtual environment
rm -rf .venv
Source Installation (make install-global)
cd /path/to/lobster
# Remove global symlink
make uninstall-global
# Optional: Remove virtual environment
make uninstall
Remove User Data (Optional)
⚠️ Warning: This permanently deletes all your analysis data!
Lobster stores user data separately from the package:
# View what will be removed
du -sh ~/.lobster ~/.lobster_workspace
# Remove command history, notebooks, workspaces
rm -rf ~/.lobster
# Remove cache, data, exports
rm -rf ~/.lobster_workspace
# Remove project configurations (in your project directories)
cd /path/to/your/project
rm .env  # API key configuration
What gets removed:
- ~/.lobster/: Command history, exported notebooks, saved workspaces
- ~/.lobster_workspace/: Download cache, intermediate data, exports
- .env files: API key configurations (project-specific)
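Before deleting anything, a dry-run listing of these paths is a safe first step (sketch only; `list_user_data` is a hypothetical helper and touches nothing):

```shell
# Show what would be removed without deleting anything.
list_user_data() {
  for p in "$HOME/.lobster" "$HOME/.lobster_workspace"; do
    if [ -e "$p" ]; then
      echo "would remove: $p"
    else
      echo "absent: $p"
    fi
  done
}
list_user_data
```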
Keep user data if:
- You plan to reinstall Lobster later
- You want to preserve analysis history
- You have important exported notebooks
Verify Complete Removal
# Check command removed
which lobster # Should output: lobster not found
# Check package removed
pip list | grep lobster # Should output nothing
uv tool list | grep lobster # Should output nothing (if using uv tool)
# Check user data removed
ls ~/.lobster 2>/dev/null || echo "✅ Removed"
ls ~/.lobster_workspace 2>/dev/null || echo "✅ Removed"
Troubleshooting Uninstall
Problem: which lobster still shows a command after uninstall
Solution:
# Find the leftover executable
which lobster
ls -la $(which lobster)
# Remove manually (location varies by installation)
# For uv tool (shouldn't happen, but just in case)
rm ~/.local/bin/lobster
# For Homebrew Python
sudo rm /opt/homebrew/bin/lobster
# For system Python
sudo rm /usr/local/bin/lobster
# or
rm ~/Library/Python/3.x/bin/lobster
Problem: Package still shows in pip list after uninstall
Solution:
# Force remove
pip uninstall -y lobster-ai
# Or use uv
uv pip uninstall lobster-ai
Development Installation
Full Development Setup
# Clone the repository
git clone https://github.com/the-omics-os/lobster.git
cd lobster
# Development installation
make dev-install
# This installs:
# - All runtime dependencies
# - Testing framework (pytest, pytest-cov, pytest-xdist)
# - Code quality (black, isort, flake8, pylint, mypy)
# - Security tools (bandit)
# - Documentation (mkdocs)
# - Pre-commit hooks
Development Commands
# Run tests
make test
# Fast parallel testing
make test-fast
# Code formatting
make format
# Linting
make lint
# Type checking
make type-check
# Clean installation
make clean-install
Pre-commit Hooks
Development installation automatically sets up pre-commit hooks:
# Manual setup if needed
make setup-pre-commit
# Run on all files
pre-commit run --all-files
Optional Dependencies
These optional components enhance Lobster AI with advanced features. Install based on your analysis needs.
PyMOL (Protein Structure Visualization)
PyMOL enables 3D protein structure visualization and analysis (v0.2+).
Automated Installation (macOS):
cd lobster
make install-pymol
Manual Installation:
macOS
# Via Homebrew
brew install brewsci/bio/pymol
# Verify installation
pymol -c -Q
Linux (Ubuntu/Debian)
# Via apt
sudo apt-get update
sudo apt-get install pymol
# Verify installation
which pymol
pymol -c -Q
Linux (Fedora/RHEL)
# Via DNF
sudo dnf install pymol
# Verify installation
pymol -c -Q
Docker
PyMOL is pre-installed in the Docker image - no additional setup needed.
Usage:
# In Lobster chat
🦞 You: "Fetch protein structure 1AKE"
🦞 You: "Visualize 1AKE with PyMOL mode=interactive style=cartoon"
🦞 You: "Link protein structures to my RNA-seq data"
Troubleshooting: If PyMOL is not found, check installation:
which pymol
pymol --versionSee Protein Structure Visualization Guide for complete usage details.
Docling (Advanced PDF Parsing)
Docling provides professional-grade PDF parsing for extracting methods from scientific publications (v0.2+).
Installation:
# Basic Docling
pip install docling
# Full installation with all features
pip install "docling[all]"
# With table extraction
pip install "docling[table]"
# With OCR support
pip install "docling[ocr]"
Verify Installation:
python -c "from docling.document_converter import DocumentConverter; print('✓ Docling installed')"
Benefits:
- >90% Methods section detection (vs 30% with PyPDF2 fallback)
- Table extraction from scientific papers
- Formula recognition in publications
- Better structure detection for complex PDFs
Usage:
# Docling is used automatically by ContentAccessService
🦞 You: "Extract methods from PMID:38448586"
🦞 You: "Read full publication PMID:35042229"
Fallback Behavior: If Docling is not installed, Lobster automatically falls back to PyPDF2 with reduced functionality.
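You can check which backend will be used ahead of time. A sketch of the documented fallback, assuming `python3` runs in the same environment as Lobster (`pdf_backend` is a hypothetical helper):

```shell
# Report the active PDF backend: Docling when importable, PyPDF2 otherwise.
pdf_backend() {
  if python3 -c 'import docling' >/dev/null 2>&1; then
    echo "docling"
  else
    echo "PyPDF2 (fallback)"
  fi
}
echo "PDF backend: $(pdf_backend)"
```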
Troubleshooting:
# Test Docling functionality
python -c "import docling; print(docling.__version__)"
# Check dependencies
pip list | grep docling
See Publication Intelligence Guide for technical details.
Semantic Search (Ontology Matching)
Semantic search enables matching biomedical terms against standardized ontologies using vector embeddings (v1.0.7+).
Installation:
# Standalone
pip install 'lobster-ai[vector-search]'
# Included in full install
pip install 'lobster-ai[full]'
# Via lobster init (interactive prompt)
lobster init
The lobster init wizard prompts for semantic search installation after the Docling prompt. You can also use --install-vector-search for non-interactive setup.
Verify Installation:
python -c "import chromadb; import sentence_transformers; print('Semantic Search available')"
What it provides:
- ~30K disease concepts (MONDO ontology)
- ~15K anatomy/tissue terms (UBERON ontology)
- ~2.5K cell types (Cell Ontology)
- SapBERT biomedical embeddings (768-dim, trained on UMLS)
- ChromaDB persistent local vector store with S3 auto-download
Usage:
# Semantic matching is used automatically by agents when installed
lobster chat
You: "Match glioblastoma to MONDO ontology"
You: "Standardize tissue annotations to UBERON terms"
Fallback Behavior:
Without vector-search installed, agents fall back to keyword matching (4 hardcoded diseases). No errors — just reduced matching coverage.
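A quick way to see which matching mode you will get (sketch of the documented behavior; both packages must import for semantic mode, and `matching_mode` is a hypothetical helper):

```shell
# Semantic mode requires both chromadb and sentence_transformers.
matching_mode() {
  if python3 -c 'import chromadb, sentence_transformers' >/dev/null 2>&1; then
    echo "semantic"
  else
    echo "keyword (fallback)"
  fi
}
echo "Ontology matching mode: $(matching_mode)"
```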
Disk Space: ~800 MB total (packages + model + ontology data). Data is cached locally after first download.
See Semantic Search Guide for the full reference and Optional Dependencies for troubleshooting.
AWS Bedrock (Enhanced Setup)
Detailed AWS Bedrock configuration for production deployments.
Step 1: Create AWS Account
- Visit AWS Console
- Create account or sign in
- Navigate to AWS Bedrock service
Step 2: Request Model Access
# Navigate to: AWS Bedrock → Model Access
# Request access to: Claude 3.5 Sonnet, Claude 3 Opus
# Approval typically takes 1-2 business days
Step 3: Create IAM User
# AWS Console → IAM → Users → Create User
# User name: lobster-ai-user
# Access type: Programmatic access
# Attach policy: AmazonBedrockFullAccess
# OR create custom policy (recommended):
Custom IAM Policy (Least Privilege):
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Action": [
"bedrock:InvokeModel",
"bedrock:InvokeModelWithResponseStream",
"bedrock:ListFoundationModels"
],
"Resource": "*"
}
]
}
Step 4: Configure Credentials
# Option 1: AWS CLI configuration (recommended)
aws configure
# Enter: Access Key ID, Secret Access Key, Region (us-east-1), Output format (json)
# Option 2: Environment variables
export AWS_BEDROCK_ACCESS_KEY=AKIAIOSFODNN7EXAMPLE
export AWS_BEDROCK_SECRET_ACCESS_KEY=wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
export AWS_DEFAULT_REGION=us-east-1
# Option 3: .env file (for Lobster)
cat >> .env << EOF
AWS_BEDROCK_ACCESS_KEY=AKIAIOSFODNN7EXAMPLE
AWS_BEDROCK_SECRET_ACCESS_KEY=wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
AWS_DEFAULT_REGION=us-east-1
EOF
Step 5: Verify Access
# Test Bedrock connectivity
aws bedrock list-foundation-models --region us-east-1
# Test in Lobster
lobster chat
> /status
# Should show: "Model: AWS Bedrock (Claude)"
Troubleshooting AWS Bedrock:
# Check credentials
aws sts get-caller-identity
# Test model access
aws bedrock list-foundation-models --region us-east-1 | grep Claude
# Common issues:
# 1. Model access not approved → Wait for approval or request again
# 2. Wrong region → Bedrock availability varies by region
# 3. IAM permissions → Verify user has bedrock:InvokeModel permission
Regional Availability: AWS Bedrock Claude models are available in:
- us-east-1 (US East, N. Virginia) - Recommended
- us-west-2 (US West, Oregon)
- eu-west-1 (Europe, Ireland)
- ap-southeast-1 (Asia Pacific, Singapore)
See AWS Bedrock Regions for current availability.
Cloud Mode Configuration
Enable cloud processing for large-scale analyses (v0.2+).
Setup:
# 1. Request cloud API key
# Email: info@omics-os.com
# Subject: "Omics-OS Cloud API Key Request"
# Include: Organization name, use case, expected usage
# 2. Configure API key
export LOBSTER_CLOUD_KEY="your-cloud-api-key-here"
# 3. Start Lobster in cloud mode
lobster chat
# 4. Verify cloud mode active
> /status
# Should show: "Cloud mode: active"
Benefits:
- Scalable compute for datasets >100K cells
- No local memory limits for large datasets
- Faster processing with distributed infrastructure
- Automatic resource management
Usage:
# Cloud mode is automatic when LOBSTER_CLOUD_KEY is set
🦞 You: "Download GSE123456 and analyze with cloud resources"
🦞 You: "Process this large dataset using cloud infrastructure"
# Switch back to local mode
unset LOBSTER_CLOUD_KEY
lobster chat
Cost Structure:
- Free tier: 10 analyses/month
- Pro tier: $6K-$18K/year (based on usage)
- Enterprise: Custom pricing
Troubleshooting Cloud Mode:
# Check API key is set
echo $LOBSTER_CLOUD_KEY
# Test cloud connectivity
lobster chat
> /status
# Common issues:
# 1. API key not set → Export LOBSTER_CLOUD_KEY
# 2. Key expired → Request new key from info@omics-os.com
# 3. Network timeout → Check firewall/proxy settings
See Configuration Guide for complete cloud setup details.
Docker Deployment
Lobster supports Docker for both CLI and FastAPI server modes. For comprehensive deployment guides, see Docker Deployment Guide.
Quick Start with Docker
Unix/macOS/Linux (using Makefile):
# 1. Build images
make docker-build
# 2. Run CLI interactively
make docker-run-cli
# 3. Or run FastAPI server
make docker-run-server
Windows (using PowerShell):
# 1. Build CLI image
docker build -t lobster:latest -f Dockerfile .
# 2. Run CLI interactively
docker run -it --rm \
--env-file .env \
-v ${PWD}/data:/app/data \
-v lobster-workspace:/app/.lobster_workspace \
lobster:latest chat
# 3. Or run FastAPI server
docker build -t lobster:server -f Dockerfile.server .
docker run -d --name lobster-api -p 8000:8000 --env-file .env lobster:server
Build Docker Images
Unix/macOS/Linux:
# Build both CLI and server images (using Makefile)
make docker-build
# Or manually
docker build -t lobster:latest -f Dockerfile .
docker build -t lobster:server -f Dockerfile.server .
Windows (PowerShell):
# Build CLI image
docker build -t lobster:latest -f Dockerfile .
# Build server image (optional, for FastAPI mode)
docker build -t lobster:server -f Dockerfile.server .
Note for Windows users: The make command is not available by default on Windows. Use the manual docker build commands shown above.
Run CLI with Docker
Unix/macOS/Linux:
# Using Makefile (recommended)
make docker-run-cli
# Or manually with environment file
docker run -it --rm \
--env-file .env \
-v $(pwd)/data:/app/data \
-v lobster-workspace:/app/.lobster_workspace \
lobster:latest chat
# Single query mode (automation)
docker run --rm \
--env-file .env \
-v $(pwd)/data:/app/data \
lobster:latest query "download GSE12345"
Windows (PowerShell):
```powershell
# Interactive chat mode
docker run -it --rm `
  --env-file .env `
  -v ${PWD}/data:/app/data `
  -v lobster-workspace:/app/.lobster_workspace `
  lobster:latest chat

# Single query mode (automation)
docker run --rm `
  --env-file .env `
  -v ${PWD}/data:/app/data `
  lobster:latest query "download GSE12345"

# With individual environment variables (if .env file not available)
docker run -it --rm `
  -e ANTHROPIC_API_KEY=your-key-here `
  -v ${PWD}/data:/app/data `
  -v lobster-workspace:/app/.lobster_workspace `
  lobster:latest chat
```

Windows Notes:
- Use backtick (`` ` ``) for line continuation in PowerShell
- Use `${PWD}` to reference the current directory
- Named volumes (like `lobster-workspace`) work the same on all platforms
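To avoid retyping the long docker run invocation, the mounts above can be wrapped in a small helper. This is a convenience sketch for bash (the `lobster_docker` function is our own invention, not part of the Lobster tooling); PowerShell users can define an analogous function in their profile:

```bash
# Hypothetical wrapper around the docker run invocation shown above (bash)
lobster_docker() {
  docker run -it --rm \
    --env-file .env \
    -v "$(pwd)/data:/app/data" \
    -v lobster-workspace:/app/.lobster_workspace \
    lobster:latest "$@"
}

# Usage:
#   lobster_docker chat
#   lobster_docker query "download GSE12345"
```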
Run FastAPI Server with Docker
Unix/macOS/Linux:
```bash
# Using Makefile (recommended)
make docker-run-server

# Or manually
docker run -d \
  --name lobster-api \
  -p 8000:8000 \
  --env-file .env \
  -v $(pwd)/data:/app/data \
  lobster:server

# Check server health
curl http://localhost:8000/health

# Stop server
docker stop lobster-api
```

Windows (PowerShell):
```powershell
# Run server in detached mode
docker run -d `
  --name lobster-api `
  -p 8000:8000 `
  --env-file .env `
  -v ${PWD}/data:/app/data `
  lobster:server

# Check server health (PowerShell)
Invoke-WebRequest -Uri http://localhost:8000/health

# Or use curl if installed
curl http://localhost:8000/health

# View server logs
docker logs lobster-api

# Stop server
docker stop lobster-api

# Remove stopped container
docker rm lobster-api
```

Docker Compose
Unix/macOS/Linux:
```bash
# Run CLI interactively
make docker-compose-cli

# Start FastAPI server in background
make docker-compose-up

# View logs
docker-compose logs -f lobster-server

# Stop all services
make docker-compose-down
```

Windows (PowerShell):
```powershell
# Run CLI interactively
docker-compose run --rm lobster-cli chat

# Start FastAPI server in background
docker-compose up -d lobster-server

# View logs
docker-compose logs -f lobster-server

# Stop all services
docker-compose down
```

docker-compose.yml supports both CLI and server modes. See Docker Deployment Guide for full configuration details.
Note: Docker Compose works identically on Windows, macOS, and Linux. The commands are the same across platforms.
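For reference, a docker-compose.yml supporting both modes might look roughly like the sketch below. The service names match the commands above, but the image tags, mounts, and volume name are assumptions; treat the file shipped with the repository as authoritative:

```yaml
# Sketch only -- see the repository's docker-compose.yml for the real configuration
services:
  lobster-cli:
    image: lobster:latest
    env_file: .env
    volumes:
      - ./data:/app/data
      - lobster-workspace:/app/.lobster_workspace
    stdin_open: true   # keep STDIN open for interactive chat
    tty: true
  lobster-server:
    image: lobster:server
    env_file: .env
    ports:
      - "8000:8000"
    volumes:
      - ./data:/app/data
volumes:
  lobster-workspace:
```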
Troubleshooting
Common Installation Issues
Python Version Problems
Error: Python 3.12+ is required
Solutions:
```bash
# Check Python version
python --version
python3 --version

# Install Python 3.12+
# macOS: brew install python@3.12
# Ubuntu: sudo apt install python3.12
# Windows: Download from python.org

# Use specific Python version
python3.12 -m venv .venv
```

Virtual Environment Issues
Error: Failed to create virtual environment
Solutions:
```bash
# Install venv module (Ubuntu/Debian)
sudo apt install python3.12-venv

# Clear existing environment
rm -rf .venv

# Create manually
python3 -m venv .venv --clear

# Alternative method
python3 -m venv .venv --without-pip
source .venv/bin/activate
curl https://bootstrap.pypa.io/get-pip.py | python
```

Dependency Installation Failures
Error: Failed building wheel for [package]
Solutions:
```bash
# Install development headers (Linux)
sudo apt install python3.12-dev build-essential

# Update pip and setuptools
pip install --upgrade pip setuptools wheel

# Clear pip cache
pip cache purge

# Install with no cache
pip install --no-cache-dir -e .

# Use uv for faster, more reliable installs
pip install uv
uv pip install -e .
```

Permission Errors
Error: Permission denied
Solutions:
```bash
# Don't use sudo with pip in a virtual environment.
# Instead, ensure you own the virtual environment:
chown -R $USER:$USER .venv

# For global installation (Unix only)
sudo make install-global
```

Memory Issues During Installation
Error: Killed or memory-related errors
Solutions:
```bash
# Increase swap space (Linux)
sudo fallocate -l 2G /swapfile
sudo chmod 600 /swapfile
sudo mkswap /swapfile
sudo swapon /swapfile

# Install with limited parallelism
pip install -e . --no-build-isolation

# Use development profile for lighter resource usage
export LOBSTER_PROFILE=development
make install
```

Runtime Issues
API Key Problems
Error: API key not found or invalid
Solutions:
```bash
# Check environment variables
echo $ANTHROPIC_API_KEY       # For Claude API
echo $AWS_BEDROCK_ACCESS_KEY  # For AWS Bedrock
source .env                   # Load from file

# Test API connectivity
lobster config test

# Regenerate API keys if needed
```

Import Errors
Error: ModuleNotFoundError: No module named 'lobster'
Solutions:
```bash
# Ensure virtual environment is activated
source .venv/bin/activate

# Reinstall in development mode
pip install -e .

# Check PYTHONPATH
python -c "import sys; print(sys.path)"
```

Memory Issues During Analysis
Solutions:
```bash
# Use development profile (lighter than production)
export LOBSTER_PROFILE=development

# Reduce file size limits
export LOBSTER_MAX_FILE_SIZE_MB=100

# Monitor memory usage
htop  # Linux/macOS; Task Manager on Windows
```

Getting Additional Help
Check System Health
```bash
lobster chat
/dashboard   # Comprehensive system overview
/status      # Quick status check
```

Enable Debug Mode
```bash
# Verbose logging
lobster chat --debug --verbose

# Show reasoning
lobster chat --reasoning
```

Log Files
```bash
# Check logs in workspace
ls .lobster_workspace/logs/

# Enable detailed logging
export LOBSTER_LOG_LEVEL=DEBUG
```

Community Support
- GitHub Issues: Report bugs
- Discord: Join community
- Email: Direct support
- Documentation: Full docs
Clean Reinstallation
If all else fails, perform a clean reinstallation:
```bash
# Remove everything
make uninstall              # Remove virtual environment
make clean                  # Remove build artifacts
rm -rf .lobster_workspace   # Remove workspace (optional)

# Fresh installation
make clean-install

# Or manually
rm -rf .venv
git clean -fdx  # Warning: removes all untracked files
make install
```

Next Steps: Once installation is complete, see the Configuration Guide to set up API keys and customize your Lobster AI environment.