Shared Library Tool Inventory

Comprehensive catalog of all available tools, providers, clients, and orchestrators
๐Ÿ“ /home/coolhand/shared/ | Author: Luke Steuber | Updated: 2025-11-19
14 LLM Providers
12 Data Clients
32 Tool Modules
5 Orchestrators
100% Coverage

🤖 LLM Providers (14)

Unified abstraction layer for chat completion, streaming, image generation, and vision analysis across 14 different AI providers. All inherit from BaseLLMProvider for consistent interfaces.

Import: from shared.llm_providers.factory import ProviderFactory

Anthropic (Claude)

Claude Opus, Sonnet, and Haiku models with advanced reasoning and 200K context windows
Vision Streaming API Key Required
Models: claude-opus-4-7, claude-sonnet-4-6, claude-haiku-4-5
File: llm_providers/anthropic_provider.py
provider = ProviderFactory.get_provider('anthropic')
response = await provider.chat(messages)

Claude Code Provider (Hybrid)

Auto-detects the Claude Code context and uses the free instance when available, falling back to the Anthropic API otherwise
Free in Claude Code Vision Streaming
Context Detection: CLAUDE_CODE environment variable
File: llm_providers/claude_code_provider.py
provider = ClaudeCodeProvider()
print(f"Mode: {provider.get_mode()}")  # 'claude-code' or 'api'
response = await provider.chat(messages)

OpenAI (GPT-4)

GPT-4o, GPT-4-turbo, GPT-3.5-turbo with function calling, vision, and DALL-E integration
Vision DALL-E 3 Streaming API Key Required
Models: gpt-4o, gpt-4o-mini, gpt-4-turbo, gpt-3.5-turbo
File: llm_providers/openai_provider.py
provider = ProviderFactory.get_provider('openai')
image = await provider.generate_image("A Swiss design poster")

xAI (Grok)

Grok models with real-time data access and vision analysis
Vision Streaming API Key Required
Models: grok-beta, grok-vision-beta
File: llm_providers/xai_provider.py

Mistral AI

Mistral Large, Medium, Small with European data privacy compliance
Streaming API Key Required
Models: mistral-large-latest, mistral-medium-latest, mistral-small-latest
File: llm_providers/mistral_provider.py

Google Gemini

Gemini 1.5 Pro/Flash with multimodal capabilities and long context
Vision Streaming API Key Required
Models: gemini-1.5-pro, gemini-1.5-flash
File: llm_providers/gemini_provider.py

Cohere

Command models optimized for business use cases and embeddings
Streaming API Key Required
Models: command, command-light
File: llm_providers/cohere_provider.py

Perplexity

Sonar models with integrated web search and citation support
Streaming API Key Required
Models: sonar, sonar-medium
File: llm_providers/perplexity_provider.py

Groq

Ultra-fast Llama 3.1 inference with LPU acceleration
Streaming API Key Required
Models: llama-3.1-70b-versatile, llama-3.1-8b-instant
File: llm_providers/groq_provider.py

HuggingFace

Access to 100k+ open-source models via Inference API
Streaming API Key Required
Models: Custom model selection
File: llm_providers/huggingface_provider.py

ElevenLabs

High-quality text-to-speech with voice cloning and multilingual support
Text-to-Speech API Key Required
Voices: Custom voice library
File: llm_providers/elevenlabs_provider.py

Manus

Specialized provider for specific use cases
API Key Required
File: llm_providers/manus_provider.py

📊 Data Fetching Clients (12)

API wrappers for external data sources with automatic caching, rate limiting, and error handling. All clients follow consistent patterns for authentication and data retrieval.

Import: from shared.data_fetching import CensusClient, ArxivClient, ...

Census Bureau API

Access ACS, SAIPE, population data with automatic FIPS code generation and geographic hierarchies
API Key Required
Data: Demographics, poverty, housing, economic indicators
File: data_fetching/census_client.py
Tool Wrapper: tools/census_tool.py
client = CensusClient()
data = client.fetch_acs(variables=['B01001_001E'], state='06')

arXiv

Search and retrieve academic papers from arXiv.org with metadata extraction
No API Key
Data: Research papers, abstracts, citations
File: data_fetching/arxiv_client.py
Tool Wrapper: tools/arxiv_tool.py

Semantic Scholar

Access citation network and paper metadata with influence metrics
No API Key
Data: Citations, references, paper influence scores
File: data_fetching/semantic_scholar.py
Tool Wrapper: tools/semantic_scholar_tool.py

Internet Archive

Search and retrieve archived web pages, books, media from archive.org
No API Key
Data: Historical web snapshots, digital library
File: data_fetching/archive_client.py
Tool Wrapper: tools/archive_tool.py

Wikipedia

Extract article content, summaries, and metadata from Wikipedia
No API Key
Data: Encyclopedia articles, summaries, categories
File: data_fetching/wikipedia_client.py
Tool Wrapper: tools/wikipedia_tool.py

OpenLibrary

Search books, authors, and ISBNs with comprehensive metadata
No API Key
Data: Book metadata, ISBN lookups, author info
File: data_fetching/openlibrary_client.py
Tool Wrapper: tools/openlibrary_tool.py

NASA APIs

APOD, Mars Rover photos, NEO data, and space imagery
API Key Required
Data: Astronomy pictures, Mars photos, asteroid data
File: data_fetching/nasa_client.py
Tool Wrapper: tools/nasa_tool.py

GitHub

Repository search, user data, commit history, and issue tracking
API Key Optional
Data: Repos, commits, issues, pull requests
File: data_fetching/github_client.py
Tool Wrapper: tools/github_tool.py

YouTube

Video metadata, transcripts, channel statistics
API Key Required
Data: Video info, captions, channel data
File: data_fetching/youtube_client.py
Tool Wrapper: tools/youtube_tool.py

News API

Current news articles with source filtering and keyword search
API Key Required
Data: News articles, headlines, sources
File: data_fetching/news_client.py
Tool Wrapper: tools/news_tool.py

Weather

Current weather, forecasts, and historical data
API Key Required
Data: Temperature, conditions, forecasts
File: data_fetching/weather_client.py
Tool Wrapper: tools/weather_tool.py

Finance

Stock quotes, market data, and financial indicators
API Key Required
Data: Stock prices, market trends, financial news
File: data_fetching/finance_client.py
Tool Wrapper: tools/finance_tool.py

🔧 Tool Modules (32)

Tool registry system for dynamic module loading and orchestration. Tools inherit from ToolModuleBase and register schemas and handlers with ToolRegistry. Used by Swarm and Beltalowda orchestrators for multi-agent workflows.

Import: from shared.tools import ToolRegistry

Provider-Specific Tools (14)

One tool module per LLM provider for orchestrator integration
Files:
  • tools/anthropic_tools.py
  • tools/openai_tools.py
  • tools/xai_tools.py
  • tools/mistral_tools.py
  • tools/cohere_tools.py
  • tools/gemini_tools.py
  • tools/perplexity_tools.py
  • tools/groq_tools.py
  • tools/huggingface_tools.py
  • tools/elevenlabs_tools.py

Data Fetching Tools (12)

Tool wrappers for all data fetching clients
Files:
  • tools/census_tool.py
  • tools/arxiv_tool.py
  • tools/semantic_scholar_tool.py
  • tools/archive_tool.py
  • tools/wikipedia_tool.py
  • tools/openlibrary_tool.py
  • tools/nasa_tool.py
  • tools/github_tool.py
  • tools/youtube_tool.py
  • tools/news_tool.py
  • tools/weather_tool.py
  • tools/finance_tool.py

Capability Tools (6)

Specialized tools for image generation, vision analysis, TTS
Files:
  • tools/image_generation_tools.py
  • tools/vision_tools.py
  • tools/tts_tools.py
  • tools/web_search_tool.py

Registry System

Singleton registry for dynamic tool discovery and registration
Key Features:
  • Automatic tool discovery from hive/ directories
  • Schema validation and enforcement
  • Provider-specific tool namespacing
  • Dynamic handler registration
Files:
  • tools/registry.py - Singleton registry
  • tools/module_base.py - Base class for tools
  • tools/provider_registry.py - Provider tracking
registry = ToolRegistry.get_instance()
registry.register_tool('search', schema, handler)
tools = registry.list_tools()

🎭 Orchestrators (5)

Multi-agent orchestration patterns for complex research and analysis tasks. All inherit from BaseOrchestrator with a decompose → execute → synthesize workflow.

Import: from shared.orchestration import DreamCascadeOrchestrator, DreamSwarmOrchestrator

โš ๏ธ Note: Renamed from Beltalowda/Swarm (Nov 2025). Backward compatibility maintained via aliases.

Dream Cascade Orchestrator

Hierarchical research with 3-tier cascading synthesis: Workers (parallel execution) → Mid-level synthesis → Executive synthesis
API Key Required MCP Pattern: dream-cascade
Best For: Deep research, comprehensive analysis, multi-perspective synthesis
Agents: 8 workers + mid-level + executive (configurable)
File: orchestration/dream_cascade_orchestrator.py
MCP Tool: dream_research
Documents: PDF, DOCX, Markdown generation
orchestrator = DreamCascadeOrchestrator()
result = await orchestrator.execute_workflow(
    task="Research quantum computing applications"
)

Dream Swarm Orchestrator

Domain-specific multi-agent search with specialized agent types (Academic, Tech, Business, Social, Media)
API Key Required MCP Pattern: dream-swarm
Best For: Focused search, domain expertise, parallel investigation
Agents: 5 specialists (configurable types)
File: orchestration/dream_swarm_orchestrator.py
MCP Tool: dream_search
Documents: PDF, DOCX, Markdown generation
orchestrator = DreamSwarmOrchestrator()
result = await orchestrator.execute_workflow(
    query="Latest AI breakthroughs",
    allowed_agent_types=['academic', 'tech']
)

Sequential Orchestrator

Linear workflow with step-by-step task execution and state passing
API Key Required
Best For: Pipeline processing, dependent steps, ordered workflows
File: orchestration/sequential_orchestrator.py

Conditional Orchestrator

Decision-tree workflows with conditional branching and routing
API Key Required
Best For: Complex decision logic, adaptive workflows
File: orchestration/conditional_orchestrator.py

Iterative Orchestrator

Refinement loops with quality assessment and iteration control
API Key Required
Best For: Quality improvement, iterative refinement, self-correction
File: orchestration/iterative_orchestrator.py

💡 Usage Examples

Basic Provider Usage

from shared.llm_providers.factory import ProviderFactory
from shared.config import Config

# Get configuration
config = Config()
api_key = config.get_api_key('openai')

# Create provider
provider = ProviderFactory.get_provider('openai')

# Chat completion
messages = [
    {"role": "system", "content": "You are a helpful assistant"},
    {"role": "user", "content": "Explain quantum computing"}
]
response = await provider.chat(messages)
print(response['content'])

Complexity-Based Model Selection

from shared.llm_providers.factory import ProviderFactory

# Automatic model selection based on query complexity
model, meta = ProviderFactory.select_model_by_complexity(
    query="What is Python?",
    provider_name='openai'
)
# Returns: ('gpt-4o-mini', {'complexity': 'simple', 'cost_tier': 'low'})

# Complex query routing
model, meta = ProviderFactory.select_model_by_complexity(
    query="Analyze the architectural tradeoffs between microservices and monoliths",
    provider_name='anthropic'
)
# Returns: ('claude-opus-4-7', {'complexity': 'complex', 'cost_tier': 'high'})

Hybrid Claude Code Provider

from shared.llm_providers.claude_code_provider import ClaudeCodeProvider

# Auto-detects context (Claude Code or API)
provider = ClaudeCodeProvider()

# Check mode
if provider.get_mode() == 'claude-code':
    print("✓ Using Claude Code - no API costs!")
else:
    print("✓ Using Anthropic API - standard pricing")

# Same code works in both contexts
response = await provider.chat(messages)

Vision Analysis with AltFlow Pattern

from shared.web.vision_service import generate_alt_text

# Generate accessible alt text
result = await generate_alt_text(
    image_data=base64_string,
    provider_name='anthropic',
    max_length=700
)
print(result['alt_text'])
print(f"Length: {result['length']} chars")
print(f"Warnings: {result['warnings']}")

Data Fetching with Caching

from shared.data_fetching import CensusClient

# Create client with caching
client = CensusClient(use_cache=True, cache_ttl=3600)

# Fetch demographic data
data = client.fetch_acs(
    variables=['B01001_001E'],  # Total population
    state='06',                 # California
    county='*'                  # All counties
)
# Automatic FIPS code generation and caching

Flask Integration with Async Adapter

from flask import Flask, request, jsonify
from shared.llm_providers.factory import ProviderFactory
from shared.utils.async_adapter import async_to_sync

app = Flask(__name__)
provider = ProviderFactory.get_provider('openai')

@app.route('/chat', methods=['POST'])
@async_to_sync
async def chat():
    """Route with automatic async/sync bridging"""
    data = request.json
    response = await provider.chat(data['messages'])
    return jsonify(response)

# No manual event loop management needed!

Tool Registry Integration

from shared.tools import ToolRegistry, ToolModuleBase

# Create custom tool
class MyTool(ToolModuleBase):
    name = "my_tool"
    display_name = "My Custom Tool"
    description = "Does something useful"

    def initialize(self):
        self.tool_schemas = [{
            "name": "search",
            "description": "Search for information",
            "parameters": {...}
        }]

    async def search(self, query: str):
        """Tool implementation"""
        return {"results": [...]}

# Register with registry
tool = MyTool()
tool.register_with_registry()

# Use in orchestrator
registry = ToolRegistry.get_instance()
result = await registry.execute_tool('my_tool.search', {'query': 'test'})

Orchestrator Execution

from shared.orchestration import DreamCascadeOrchestrator

# Initialize Dream Cascade orchestrator
orchestrator = DreamCascadeOrchestrator(
    num_agents=8,
    enable_drummer=True,     # Mid-level synthesis
    enable_camina=True,      # Executive synthesis
    generate_documents=True,
    document_formats=['pdf', 'markdown']
)

# Execute research workflow
result = await orchestrator.execute_workflow(
    task="Comprehensive analysis of renewable energy trends",
    title="Renewable Energy Research"
)

# Access results
print(result['final_synthesis'])
print(f"Documents: {result['documents']}")

🔧 System Status

✅ Registered Tools & Providers

LLM Providers (12/12)
✓ OpenAI
✓ Anthropic
✓ xAI
✓ Mistral
✓ Cohere
✓ Gemini
✓ Perplexity
✓ Groq
✓ HuggingFace
✓ Manus
✓ ElevenLabs
✓ ClaudeCode
Data Clients (12/12)
✓ Archive
✓ arXiv
✓ Census
✓ Finance
✓ GitHub
✓ NASA
✓ News
✓ OpenLibrary
✓ Semantic Scholar
✓ Weather
✓ Wikipedia
✓ YouTube
Orchestrators (5/5)
✓ Dream Cascade, formerly Beltalowda (Hierarchical Research)
✓ Dream Swarm, formerly Swarm (Domain-Specific Search)
✓ Sequential (Pipeline)
✓ Conditional (Decision Tree)
✓ Iterative (Refinement)
Coverage: 100%
All 63 components registered and functional

🔑 API Key Configuration

API keys should be configured in /home/coolhand/API_KEYS.md or as environment variables. Check marks indicate key presence (not functionality).

Required for LLM Providers
OpenAI (GPT-4, DALL-E) OPENAI_API_KEY
Anthropic (Claude) ANTHROPIC_API_KEY
xAI (Grok) XAI_API_KEY
Mistral MISTRAL_API_KEY
Cohere COHERE_API_KEY
Gemini GEMINI_API_KEY
Perplexity PERPLEXITY_API_KEY
Groq GROQ_API_KEY
HuggingFace HUGGINGFACE_API_KEY
ElevenLabs (TTS) ELEVENLABS_API_KEY
Required for Data Clients
Census Bureau CENSUS_API_KEY
NASA NASA_API_KEY
News API NEWS_API_KEY
Weather WEATHER_API_KEY
YouTube YOUTUBE_API_KEY
GitHub (optional) GITHUB_API_KEY
No API Key Required
✓ ClaudeCode (uses Claude Code instance)
✓ Archive.org
✓ arXiv
✓ OpenLibrary
✓ Semantic Scholar
✓ Wikipedia
โš ๏ธ Configuration Note
Store API keys in /home/coolhand/API_KEYS.md (gitignored)
or set as environment variables before running services
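For the environment-variable route, a minimal shell sketch with placeholder values; the `.env` loading step assumes plain `KEY=value` lines, which may not match how API_KEYS.md is actually formatted:

```shell
# Export keys for the current shell session (placeholder values)
export OPENAI_API_KEY="your-openai-key"
export ANTHROPIC_API_KEY="your-anthropic-key"
export CENSUS_API_KEY="your-census-key"

# Or, if you keep keys in a KEY=value .env file, load them all at once
[ -f .env ] && { set -a; . ./.env; set +a; }
```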

🧪 Quick Test Commands

Test Provider Registration:
python3 -c "from shared.llm_providers import ProviderFactory; print(ProviderFactory.list_providers())"
Test Data Tools:
python3 -c "from shared.tools import register_data_tools; print(register_data_tools())"
Find Vision Providers:
python3 -c "from shared.llm_providers import ProviderFactory; print(ProviderFactory.find_providers_with_capability('vision'))"
Check Capabilities:
python3 -c "from shared.llm_providers import PROVIDER_CAPABILITIES; import json; print(json.dumps(PROVIDER_CAPABILITIES['openai'], indent=2))"