Discover data science & ML Claude skills. Browse 61 skills and find the perfect capability for your AI workflow.
Automates Life Sciences R&D workflows by integrating Benchling registry, inventory, and notebook operations via Python SDK and REST API.
Automates the end-to-end MLOps lifecycle from data preparation and model training to production deployment and monitoring.
Optimizes Jupyter notebook workflows by preventing unnecessary kernel restarts when using the IPython autoreload extension.
Generates professional, data-driven presentations and technical whitepapers with robust citation management and reproducibility standards.
Refactors Jupyter notebook code into reusable Python modules while preserving critical variable definitions and configuration.
Generates professional terminal-based and image-based visualizations to communicate data patterns and analytics results clearly.
Implements a 7-action space with integrated position sizing and small account simulation for reinforcement learning trading models.
Implements a systematic data quality remediation process to detect duplicates, handle outliers, and standardize inconsistencies for reliable analysis.
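The remediation steps this skill names (duplicate detection, outlier handling) can be sketched in plain Python; the function below is an illustrative assumption, not the skill's actual implementation, using order-preserving de-duplication and the common 1.5×IQR clipping heuristic:

```python
import statistics

def remediate(values):
    """Hypothetical data-quality pass: drop exact duplicates
    (keeping first occurrence), then clip IQR outliers."""
    # De-duplicate while preserving the original order.
    seen, deduped = set(), []
    for v in values:
        if v not in seen:
            seen.add(v)
            deduped.append(v)
    # Clip values outside 1.5 * IQR, a common outlier heuristic.
    q1, _, q3 = statistics.quantiles(deduped, n=4)
    iqr = q3 - q1
    lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    return [min(max(v, lo), hi) for v in deduped]
```

Clipping (rather than dropping) outliers keeps the row count stable, which matters when the cleaned series feeds a downstream join or time index.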
Ensures quantitative accuracy in microscopy deconvolution by preserving original intensity relationships across image channels.
Automates the archival and quality classification of algorithmic trading models based on performance metrics and risk thresholds.
Corrects file path errors and API key configuration issues that arise in Google Colab environments after repository extraction.
Optimizes PPO neural network dimensions to balance trading model capacity, inference speed, and hardware memory usage.
Integrates multi-agent Claude systems into algorithmic trading pipelines to optimize model training and manage live risk with automated oversight.
Optimizes financial data retrieval by caching market symbol data to reduce API latency and avoid rate limits during trading bot development.
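The caching pattern this skill describes is a time-to-live (TTL) lookaside cache; a minimal sketch follows, assuming a generic `fetch` callable standing in for the real REST call (all names are illustrative, not a specific exchange API):

```python
import time

class SymbolCache:
    """Hypothetical TTL cache for market-symbol lookups."""

    def __init__(self, fetch, ttl_seconds=300):
        self._fetch = fetch          # fallback, e.g. a REST call
        self._ttl = ttl_seconds
        self._store = {}             # symbol -> (expiry, payload)

    def get(self, symbol):
        now = time.monotonic()
        hit = self._store.get(symbol)
        if hit and hit[0] > now:     # fresh entry: skip the network
            return hit[1]
        payload = self._fetch(symbol)
        self._store[symbol] = (now + self._ttl, payload)
        return payload
```

Because symbol metadata changes rarely, even a short TTL eliminates most repeat requests, which is what keeps a development loop under the provider's rate limits.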
Optimizes the visualization of sparse single-cell gene expression data by implementing alternative plotting patterns that prevent boxplot collapse.
Implements standardized risk management and drawdown protection patterns for algorithmic trading systems.
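Drawdown protection of the kind this skill standardizes usually reduces to tracking peak-to-trough decline and halting past a limit; a minimal sketch, with the 20% threshold as an illustrative assumption:

```python
def max_drawdown(equity_curve):
    """Worst peak-to-trough decline as a fraction of the running peak."""
    peak, worst = equity_curve[0], 0.0
    for value in equity_curve:
        peak = max(peak, value)
        worst = max(worst, (peak - value) / peak)
    return worst

def should_halt(equity_curve, limit=0.20):
    # Hypothetical circuit breaker: stop trading past a 20% drawdown.
    return max_drawdown(equity_curve) >= limit
```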
Persists trading backtest results to an SQLite database to enable historical performance comparison and model optimization.
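Persisting backtests to SQLite for later comparison can be sketched as below; the table schema and metric columns are assumptions for illustration, not the skill's actual layout:

```python
import sqlite3

def save_backtest(conn, model, sharpe, max_dd):
    """Persist one backtest run (schema is an illustrative assumption)."""
    conn.execute(
        """CREATE TABLE IF NOT EXISTS backtests (
               model TEXT, sharpe REAL, max_drawdown REAL,
               run_at TEXT DEFAULT CURRENT_TIMESTAMP)"""
    )
    conn.execute(
        "INSERT INTO backtests (model, sharpe, max_drawdown) VALUES (?, ?, ?)",
        (model, sharpe, max_dd),
    )
    conn.commit()

def best_model(conn):
    # SQLite returns the bare column from the row holding the aggregate max.
    row = conn.execute("SELECT model, MAX(sharpe) FROM backtests").fetchone()
    return row[0]
```

Keeping every run in one table turns model selection into a query, so historical comparisons survive kernel restarts and notebook re-runs.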
Optimizes CODEX and scRNAseq data integration by incorporating tissue heterogeneity into spatial cell matching workflows.
Standardizes the generation and PDF archival of Quality Control (QC) heatmaps and profiles for microscopy quantification workflows.
Predicts market orderbook depth and bid-ask spreads to optimize trade execution timing using Nixtla TimeGPT and historical market data.
Implements percentage-based capital allocation and risk limits to ensure trading strategies scale effectively across all account sizes.
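Percentage-based allocation scales across account sizes because the sizing rule is expressed in fractions of equity rather than absolute dollars; a common fixed-fractional formula, sketched with illustrative parameter names:

```python
def position_size(equity, risk_pct, entry, stop):
    """Units to trade so that hitting the stop loses roughly
    `risk_pct` of current equity (fixed-fractional sizing)."""
    risk_per_unit = abs(entry - stop)
    if risk_per_unit == 0:
        raise ValueError("entry and stop must differ")
    return (equity * risk_pct) / risk_per_unit
```

The same 1% rule yields 20 units on a $10,000 account and 2 units on a $1,000 account, which is exactly the size-invariance the description claims.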
Optimizes reinforcement learning training stability and monitoring through learning rate warmups, validation scheduling, and reward weight tuning.
Optimizes financial asset selection by applying differentiated filtering logic for equities and cryptocurrencies within trading systems.
Derives trading selection thresholds from market data to replace hardcoded parameters with dynamic, data-driven values.
Identifies and fixes hardcoded values in Jupyter notebooks that conflict with defined configuration constants.
Generates professional, locally-executed PDF reports featuring formatted text, data tables, and embedded visualizations using the reportlab library.
Optimizes LLM performance through advanced prompt engineering, RAG architecture design, and agentic system orchestration.
Streamlines the fine-tuning of Nixtla TimeGPT models on custom datasets for high-precision, domain-specific forecasting.
Interacts with diverse large language models through a command-line interface to perform tasks like prompt execution, data extraction, and embedding management.
Analyzes and visualizes high-throughput sequencing data including ChIP-seq, RNA-seq, and ATAC-seq.