ai_client_factory module¶
AI Client Factory - Creates and manages different AI clients with fallback logic
- class ai_client_factory.AIClientFactory(base_dir='.')[source]¶
Bases: object
Factory for creating AI clients with automatic provider selection and fallback
- create_client()[source]¶
Create an AI client based on configuration and availability
Provider selection logic:
1. If AI_PROVIDER is set to a specific provider, try that first
2. If "auto" or not set, try providers in order: Llama -> Claude -> Sample
3. If fallback is enabled, continue down the chain until one works
4. If no providers work, return a mock client with sample content
- class ai_client_factory.SampleAIClient(base_dir='.')[source]¶
Bases: BaseAIClient
Fallback AI client that provides sample content
- generate_all_cover_letter_content(job_description, profile_content, company_name, position_title)[source]¶
Generate sample cover letter content
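To illustrate the fallback role, here is a hypothetical minimal stand-in for SampleAIClient: the method name and parameters follow the signature above, while the class name and the returned field names are invented for this sketch.

```python
class MiniSampleClient:
    """Illustrative always-available fallback client (not the real SampleAIClient)."""

    def generate_all_cover_letter_content(self, job_description,
                                          profile_content, company_name,
                                          position_title):
        # Canned placeholder text: no network access or API key is required,
        # so this code path can never fail -- that is what makes it a safe
        # last link in the fallback chain.
        return {
            "opening": f"I am excited to apply for the {position_title} "
                       f"position at {company_name}.",
            "body": "Sample body paragraph based on the applicant profile.",
            "closing": "Sample closing paragraph.",
        }
```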
Provider Selection¶
The factory implements an intelligent fallback chain:
1. Llama/Ollama - Local AI models (privacy-focused)
2. Claude API - Cloud-based AI (high quality)
3. Sample Content - Built-in fallback (always available)
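The chain above can be sketched in plain Python. The provider names and environment variables come from this page; the choose_provider helper and its available argument are illustrative, not the factory's real internals.

```python
import os

PROVIDER_CHAIN = ["llama", "claude", "sample"]  # order used in "auto" mode

def choose_provider(available):
    """Pick a provider following the documented selection rules.

    `available` maps provider name -> bool (reachable and configured?).
    """
    requested = os.environ.get("AI_PROVIDER", "auto")
    fallback = os.environ.get("AI_ENABLE_FALLBACK", "true") == "true"

    # Rule 1: a specifically requested provider is tried first;
    # rule 2: "auto" walks the default chain in order.
    chain = list(PROVIDER_CHAIN) if requested == "auto" else [requested]

    # Rule 3: with fallback enabled, continue down the default chain.
    if fallback and requested != "auto":
        chain += [p for p in PROVIDER_CHAIN if p != requested]

    for provider in chain:
        if available.get(provider):
            return provider

    # Rule 4: sample content is the always-available last resort.
    return "sample"
```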
Usage Examples¶
Basic Usage¶
from ai_client_factory import AIClientFactory

# Create factory with auto-selection
factory = AIClientFactory()
client = factory.create_client()

# Generate content
content = client.generate_all_cover_letter_content(
    job_description="...",
    profile_content="...",
    company_name="Example Corp",
    position_title="Software Engineer"
)
Provider Testing¶
# Test all available providers
factory = AIClientFactory()
results = factory.test_all_providers()

for provider, result in results.items():
    status = "✅" if result["available"] else "❌"
    print(f"{status} {provider}: {result}")
Forced Provider Selection¶
import os
# Force specific provider
os.environ["AI_PROVIDER"] = "claude"
factory = AIClientFactory()
client = factory.create_client() # Will use Claude only
Configuration¶
Environment Variables¶
AI_PROVIDER: auto, claude, llama, or sample
AI_ENABLE_FALLBACK: true/false (enable fallback chain)
ANTHROPIC_API_KEY: Claude API key
LLAMA_MODEL: Specific Llama model to use
OLLAMA_BASE_URL: Ollama server URL
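A possible shell setup combining these variables; the values shown, including the model name and API key, are placeholders.

```shell
# Local-first selection with fallback to Claude, then sample content
export AI_PROVIDER=auto
export AI_ENABLE_FALLBACK=true

# Ollama's default local endpoint
export OLLAMA_BASE_URL=http://localhost:11434
export LLAMA_MODEL=llama3            # placeholder model name
export ANTHROPIC_API_KEY=sk-...      # placeholder key
```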