Hyperparameter Tuning Expert
Systematic optimization of machine learning model hyperparameters using Bayesian optimization, multi-fidelity methods, and distributed search strategies to find optimal configurations efficiently.
SupaScore: 83.55
Best for
- Optimizing XGBoost hyperparameters using Bayesian optimization with Optuna for tabular data (a minimal sketch follows this list)
- Tuning neural network architecture and training hyperparameters with Ray Tune and ASHA pruning
- Setting up distributed hyperparameter search across GPU clusters for large model training
- Implementing multi-fidelity optimization to reduce training time while still finding strong configurations
- Analyzing hyperparameter importance and interaction effects from completed tuning runs
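A minimal sketch of the first use case, assuming a scikit-learn-style tabular dataset and illustrative parameter ranges; the actual search space, trial budget, and CV scheme would come from the full skill configuration:

```python
# Sketch: Optuna + XGBoost with a TPE sampler, log-uniform search space,
# and 5-fold cross-validation. Dataset and ranges are placeholders.
import optuna
import xgboost as xgb
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)  # stand-in tabular dataset

def objective(trial):
    params = {
        # log-uniform ranges for scale-sensitive parameters
        "learning_rate": trial.suggest_float("learning_rate", 1e-3, 0.3, log=True),
        "reg_lambda": trial.suggest_float("reg_lambda", 1e-3, 10.0, log=True),
        # plain integer / uniform ranges for the rest
        "max_depth": trial.suggest_int("max_depth", 3, 10),
        "subsample": trial.suggest_float("subsample", 0.5, 1.0),
        "n_estimators": trial.suggest_int("n_estimators", 100, 1000),
    }
    model = xgb.XGBClassifier(**params, eval_metric="logloss")
    # 5-fold CV; the mean AUC is the value TPE tries to maximize
    return cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()

study = optuna.create_study(
    direction="maximize",
    sampler=optuna.samplers.TPESampler(seed=42),
    # MedianPruner only takes effect for trials that report intermediate values
    pruner=optuna.pruners.MedianPruner(n_warmup_steps=5),
)
study.optimize(objective, n_trials=50)
print(study.best_params, study.best_value)
```

The TPE sampler models promising regions from completed trials, and the log-uniform ranges keep scale-sensitive parameters such as learning rate and regularization explored evenly across orders of magnitude.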
What you'll get
- Complete Optuna study configuration with TPE sampler, search space definitions using log-uniform distributions, MedianPruner setup, and cross-validation integration
- Ray Tune experiment with ASHA scheduler configuration, resource allocation strategy, and distributed training setup with proper checkpointing (see the sketch after this list)
- Analysis framework with parameter importance plots, optimization history visualization, and statistical significance testing of results
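A sketch of the Ray Tune deliverable, with a toy training function standing in for a real model; the classic tune.run / tune.report interface is assumed here and may differ in recent Ray releases, and the resource values are illustrative:

```python
# Sketch: Ray Tune with an ASHA scheduler that stops weak trials early.
# The training function is a toy stand-in for a real (e.g. PyTorch) loop.
import numpy as np
from ray import tune
from ray.tune.schedulers import ASHAScheduler

def train_model(config):
    accuracy = 0.0
    for epoch in range(config["max_epochs"]):
        # pretend accuracy improves at a rate driven by the hyperparameters
        accuracy += config["lr"] * np.random.uniform(0.5, 1.0)
        # intermediate reports let ASHA halt underperforming trials early
        tune.report(mean_accuracy=min(accuracy, 1.0))

search_space = {
    "lr": tune.loguniform(1e-4, 1e-1),
    "batch_size": tune.choice([32, 64, 128]),
    "max_epochs": 20,
}

scheduler = ASHAScheduler(
    metric="mean_accuracy",
    mode="max",
    max_t=20,            # maximum epochs per trial
    grace_period=2,      # minimum epochs before a trial can be stopped
    reduction_factor=3,  # keep roughly the top third at each rung
)

analysis = tune.run(
    train_model,
    config=search_space,
    num_samples=50,
    scheduler=scheduler,
    resources_per_trial={"cpu": 2, "gpu": 0},  # adjust for your cluster
)
print(analysis.get_best_config(metric="mean_accuracy", mode="max"))
```

ASHA promotes only the best-performing trials at each rung, so weak configurations stop after the grace period instead of consuming the full training budget.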
Not designed for
- Model architecture design or feature engineering decisions
- Data preprocessing or cleaning workflows
- Model interpretation or explainability analysis
- Production model deployment and monitoring
Provide the model type, dataset characteristics, evaluation metric, computational budget, and current baseline performance so an optimal tuning strategy can be designed around them.
You'll receive a complete hyperparameter optimization setup with search space definition, optimization strategy selection, cross-validation configuration, and a result analysis framework.
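For the result-analysis piece, a minimal sketch using Optuna's built-in importance and history tooling; it assumes `study` is the object from the Optuna sketch above and that plotly is installed for the interactive figures:

```python
# Sketch: post-hoc analysis of a completed Optuna study.
from optuna.importance import get_param_importances
from optuna.visualization import plot_optimization_history, plot_param_importances

# fANOVA-based importance scores as a plain dict (useful for reports)
importances = get_param_importances(study)
for name, score in importances.items():
    print(f"{name}: {score:.3f}")

# interactive plots: best-value trajectory and relative parameter importance
plot_optimization_history(study).show()
plot_param_importances(study).show()
```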
Evidence Policy
Enabled: this skill cites sources and distinguishes evidence from opinion.
Research Foundation: 7 sources (3 academic papers, 1 industry framework, 2 official docs, 1 book)
This skill was developed through independent research and synthesis. SupaSkills is not affiliated with or endorsed by any cited author or organisation.
Version History
Initial release
Common Workflows
Complete Model Development Pipeline
End-to-end workflow from feature engineering through hyperparameter optimization to model evaluation and deployment