Agent Skills

ai-ml-timeseries (@vasilyu1983/ai-ml-timeseries)
by vasilyu1983 · 53 · 12 forks · Updated 4/6/2026

Operational patterns, templates, and decision rules for time series forecasting (modern best practices): tree-based methods (LightGBM), deep learning (Transformers, RNNs), future-guided learning, temporal validation, feature engineering, generative TS (Chronos), and production deployment. Emphasizes explainability, long-term dependency handling, and adaptive forecasting.

Installation

$ npx agent-skills-cli install @vasilyu1983/ai-ml-timeseries

Supported assistants: Claude Code, Cursor, Copilot, Codex, Antigravity

Details

Path: frameworks/claude-code-kit/framework/skills/ai-ml-timeseries/SKILL.md
Branch: main
Scoped Name: @vasilyu1983/ai-ml-timeseries

Usage

After installing, this skill will be available to your AI coding assistant.

Verify installation:

npx agent-skills-cli list

Skill Instructions


name: ai-ml-timeseries
description: >
  Operational patterns, templates, and decision rules for time series forecasting
  (modern best practices): tree-based methods (LightGBM), deep learning
  (Transformers, RNNs), future-guided learning, temporal validation, feature
  engineering, generative TS (Chronos), and production deployment. Emphasizes
  explainability, long-term dependency handling, and adaptive forecasting.

Time Series Forecasting — Modern Patterns & Production Best Practices

Modern Best Practices (2024-2025):

  • Tree-based methods (LightGBM) deliver best performance + efficiency
  • Transformers excel at long-term dependencies but watch for distribution shifts
  • Future-Guided Learning: 44.8% AUC-ROC improvement in event forecasting
  • Explainability critical in healthcare/finance (use LightGBM + SHAP)

This skill provides operational, copy-paste-ready workflows for forecasting with recent advances: TS-specific EDA, temporal validation, lag/rolling features, model selection, multi-step forecasting, backtesting, generative AI (Chronos, TimesFM), and production deployment with drift monitoring.

It focuses on hands-on forecasting execution, not theory.


When to Use This Skill

Claude should invoke this skill when the user asks for hands-on time series forecasting, e.g.:

  • "Build a time series model for X."
  • "Create lag features / rolling windows."
  • "Help design a forecasting backtest."
  • "Pick the right forecasting model for my data."
  • "Fix leakage in forecasting."
  • "Evaluate multi-horizon forecasts."
  • "Use LLMs or generative models for TS."
  • "Set up monitoring for a forecast system."
  • "Implement LightGBM for time series."
  • "Use transformer models (TimesFM, Chronos) for forecasting."
  • "Apply Future-Guided Learning for event prediction."

If the user is asking about general ML modelling, deployment, or infrastructure, prefer:

  • ai-ml-data-science - General data science workflows, EDA, feature engineering, evaluation
  • ai-mlops - Model deployment, monitoring, drift detection, retraining automation; security, privacy, and governance for ML systems

If the user is asking about LLM/RAG/search, prefer:

  • ai-llm - LLM fine-tuning, prompting, evaluation
  • ai-rag - RAG pipeline design and optimization; search and retrieval systems

Quick Reference

| Task | Tool/Framework | Command | When to Use |
| --- | --- | --- | --- |
| TS EDA & Decomposition | Pandas, statsmodels | seasonal_decompose(), df.plot() | Identifying trend, seasonality, outliers |
| Lag/Rolling Features | Pandas, NumPy | df.shift(), df.rolling() | Creating temporal features for ML models |
| Model Training (Tree-based) | LightGBM, XGBoost | lgb.train(), xgb.train() | Tabular TS with seasonality, covariates |
| Deep Learning (Transformers) | TimesFM, Chronos | model.forecast() | Long-term dependencies, complex patterns |
| Future-Guided Learning | Custom RNN/Transformer | Feedback-based training | Event forecasting (44.8% AUC-ROC improvement) |
| Backtesting | Custom rolling windows | for window in windows: train(), test() | Temporal validation without leakage |
| Metrics Evaluation | scikit-learn, custom | mean_absolute_error(), MAPE, MASE | Multi-horizon forecast accuracy |
| Production Deployment | MLflow, Airflow | Scheduled pipelines | Automated retraining, drift monitoring |
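
MASE, listed in the metrics row, is easy to get wrong in custom code. A minimal NumPy sketch (scaling the forecast MAE by the in-sample MAE of the naive, or seasonal-naive for m > 1, forecast):

```python
import numpy as np

def mase(y_true, y_pred, y_train, m=1):
    """Mean Absolute Scaled Error: forecast MAE divided by the in-sample
    MAE of the (seasonal-)naive forecast with period m."""
    y_true, y_pred, y_train = map(np.asarray, (y_true, y_pred, y_train))
    scale = np.mean(np.abs(y_train[m:] - y_train[:-m]))
    return np.mean(np.abs(y_true - y_pred)) / scale

print(mase([5, 6], [5, 7], [1, 2, 3, 4]))  # naive in-sample MAE = 1.0 → 0.5
```

Values below 1.0 mean the model beats the naive baseline on the training scale.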

Decision Tree: Choosing Time Series Approach

User needs time series forecasting for: [Data Type]
    ├─ Strong Seasonality?
    │   ├─ Simple patterns? → LightGBM with seasonal features
    │   ├─ Complex patterns? → LightGBM + Prophet comparison
    │   └─ Multiple seasonalities? → Prophet or TBATS
    │
    ├─ Long-term Dependencies (>50 steps)?
    │   ├─ Transformers (TimesFM, Chronos) → Best for complex patterns
    │   └─ RNNs/LSTMs → Good for sequential dependencies
    │
    ├─ Event Forecasting (binary outcomes)?
    │   └─ Future-Guided Learning → 44.8% AUC-ROC improvement
    │
    ├─ Intermittent/Sparse Data (many zeros)?
    │   ├─ Croston/SBA → Classical intermittent methods
    │   └─ LightGBM with zero-inflation features → Modern approach
    │
    ├─ Multiple Covariates?
    │   ├─ LightGBM → Best with many features
    │   └─ TFT/DeepAR → If deep learning needed
    │
    └─ Explainability Required (healthcare, finance)?
        ├─ LightGBM → SHAP values, feature importance
        └─ Linear models → Most interpretable

Navigation: Core Patterns

Time Series EDA & Data Preparation

  • TS EDA Best Practices
    • Frequency detection, missing timestamps, decomposition
    • Outlier detection, level shifts, seasonality analysis
    • Granularity selection and stability checks
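
Frequency detection and missing-timestamp checks from the bullets above can be done directly in pandas; a minimal sketch with toy data:

```python
import pandas as pd

# Irregular daily series with Jan 3 missing.
s = pd.Series([1.0, 2.0, 4.0],
              index=pd.to_datetime(["2024-01-01", "2024-01-02", "2024-01-04"]))

inferred = pd.infer_freq(s.index)          # None → timestamps are irregular
full = pd.date_range(s.index.min(), s.index.max(), freq="D")
missing = full.difference(s.index)         # the gap(s) to inspect or impute
print(inferred, list(missing))
```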

Feature Engineering

  • Lag & Rolling Patterns
    • Lag features (lag_1, lag_7, lag_28 for daily data)
    • Rolling windows (mean, std, min, max, EWM)
    • Avoiding leakage, seasonal lags, datetime features
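
A leakage-safe sketch of the lag/rolling patterns above (pandas assumed); shifting before rolling keeps the current target out of its own features:

```python
import pandas as pd

df = pd.DataFrame(
    {"y": [10.0, 12.0, 11.0, 13.0, 15.0, 14.0, 16.0]},
    index=pd.date_range("2024-01-01", periods=7, freq="D"),
)

df["lag_1"] = df["y"].shift(1)                  # yesterday's value
df["lag_7"] = df["y"].shift(7)                  # same weekday last week
# Shift before rolling so the window never includes the current target.
df["roll_mean_3"] = df["y"].shift(1).rolling(3).mean()
df["dow"] = df.index.dayofweek                  # calendar feature
```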

Model Selection

  • Model Selection Guide

    • Decision rules: Strong seasonality → LightGBM, Long-term → Transformers
    • Benchmark comparison: LightGBM vs Prophet vs Transformers vs RNNs
    • Explainability considerations for mission-critical domains
  • LightGBM TS Patterns (2024-2025 best practices)

    • Why LightGBM excels: performance + efficiency + explainability
    • Feature engineering for tree-based models
    • Hyperparameter tuning for time series
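
Tree-based models need the series reshaped into a supervised feature matrix first; a sketch of that step in pandas (the commented lgb.train call and its params are illustrative, assuming lightgbm is installed):

```python
import pandas as pd

def make_supervised(y: pd.Series) -> pd.DataFrame:
    """Turn a series into a leakage-safe feature matrix for a tree model."""
    return pd.DataFrame({
        "lag_1": y.shift(1),
        "lag_7": y.shift(7),
        "roll_mean_7": y.shift(1).rolling(7).mean(),  # trailing week, no leakage
        "dow": y.index.dayofweek,
        "target": y,
    }).dropna()

y = pd.Series(range(30), dtype=float,
              index=pd.date_range("2024-01-01", periods=30, freq="D"))
frame = make_supervised(y)
# With lightgbm installed (params illustrative):
# import lightgbm as lgb
# model = lgb.train({"objective": "regression"},
#                   lgb.Dataset(frame.drop(columns="target"), frame["target"]))
```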

Forecasting Strategies

  • Multi-Step Forecasting Patterns

    • Direct strategy (separate models per horizon)
    • Recursive strategy (feed predictions back)
    • Seq2Seq strategy (Transformers, RNNs for long horizons)
  • Intermittent Demand Patterns

    • Croston, SBA, ADIDA for sparse data
    • LightGBM with zero-inflation features (modern approach)
    • Two-stage hurdle models, hierarchical Bayesian
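
The recursive strategy above can be sketched in a few lines; the one-step model here is a stand-in (mean of the last three observations), not a recommendation:

```python
import numpy as np

def recursive_forecast(history, predict_one, horizon):
    """Recursive strategy: apply a one-step model repeatedly,
    feeding each prediction back in as the newest observation."""
    h = list(history)
    preds = []
    for _ in range(horizon):
        yhat = predict_one(h)
        preds.append(yhat)
        h.append(yhat)
    return preds

# Stand-in one-step "model": mean of the last 3 observations.
preds = recursive_forecast([1.0, 2.0, 3.0],
                           lambda h: float(np.mean(h[-3:])), horizon=3)
print(preds)  # first step is mean(1, 2, 3) = 2.0
```

Note the trade-off: recursive forecasts compound their own errors over the horizon, which is why the direct strategy is often preferred for long horizons.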
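
Croston's method for intermittent demand fits in a short function; this is the standard formulation (smoothing parameter alpha, default 0.1 here, is a typical choice):

```python
def croston(y, alpha=0.1):
    """Croston's method: exponentially smooth nonzero demand sizes (z) and
    inter-demand intervals (p); the flat forecast is z / p per period."""
    z = p = None
    q = 1  # periods since the last nonzero demand
    for x in y:
        if x > 0:
            z = x if z is None else z + alpha * (x - z)
            p = q if p is None else p + alpha * (q - p)
            q = 1
        else:
            q += 1
    return 0.0 if z is None else z / p

print(croston([0, 3, 0, 3]))  # demand of 3 every 2 periods → 1.5
```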

Validation & Evaluation

  • Backtesting Patterns
    • Rolling window backtest, expanding window
    • Temporal train/validation split (no IID splits!)
    • Horizon-wise metrics, segment-level evaluation
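
A rolling-window backtest skeleton matching the bullets above; the fit_predict callable is a placeholder for any model (a naive last-value forecaster is used here):

```python
import numpy as np

def rolling_backtest(y, train_size, horizon, fit_predict):
    """Slide a fixed-size train window forward in time; the model is
    always evaluated on data strictly after its training window."""
    errs = []
    for start in range(len(y) - train_size - horizon + 1):
        train = y[start:start + train_size]
        test = y[start + train_size:start + train_size + horizon]
        preds = fit_predict(train, horizon)
        errs.append(np.mean(np.abs(np.asarray(test) - np.asarray(preds))))
    return float(np.mean(errs))

# Naive last-value forecaster as a stand-in model.
err = rolling_backtest(list(range(10)), train_size=5, horizon=2,
                       fit_predict=lambda tr, h: [tr[-1]] * h)
print(err)  # each window's errors are 1 and 2 → mean 1.5
```

Swapping the range for an anchored start gives the expanding-window variant.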

Generative & Advanced Models

  • TS-LLM Patterns
    • Chronos, TimesFM, Lag-Llama (Transformer models)
    • Future-Guided Learning (44.8% AUC-ROC boost for events)
    • Tokenization, discretization, trajectory sampling
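
The discretization idea can be illustrated with a toy mean-scale-then-bin scheme. This is only a sketch of the concept: Chronos itself uses its own scaling and quantization into a fixed token vocabulary, not the uniform bins shown here:

```python
import numpy as np

def tokenize_series(y, n_bins=4):
    """Toy discretization: mean-scale the series, then map the scaled
    values to integer tokens via uniform binning."""
    y = np.asarray(y, dtype=float)
    scaled = y / np.mean(np.abs(y))
    edges = np.linspace(scaled.min(), scaled.max(), n_bins + 1)[1:-1]
    return np.digitize(scaled, edges)

print(tokenize_series([1, 2, 3, 4]))  # → [0 1 2 3]
```

Once a series is a token sequence, a language-model-style architecture can be trained or prompted on it, and sampled trajectories can be de-tokenized back into forecasts.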

Production Deployment

  • Production Deployment Patterns
    • Feature pipelines (same code for train/serve)
    • Retraining strategies (time-based, drift-triggered)
    • Monitoring (error drift, feature drift, volume drift)
    • Fallback strategies, streaming ingestion, data governance
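
Error-drift monitoring from the bullets above can start as simple as comparing recent errors to a baseline; the window and threshold below are illustrative defaults, not recommendations:

```python
import numpy as np

def error_drift_alert(errors, window=30, threshold=1.5):
    """Flag drift when the recent mean error exceeds `threshold` times
    the historical baseline mean error."""
    errors = np.asarray(errors, dtype=float)
    baseline = errors[:-window].mean()
    recent = errors[-window:].mean()
    return bool(recent > threshold * baseline)

print(error_drift_alert([1.0] * 60 + [2.0] * 30))  # True: error doubled
```

In production this check would typically run on each scheduled scoring batch and trigger the retraining or fallback path.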

Navigation: Templates (Copy-Paste Ready)

Data Preparation

Feature Templates

Model Templates

Evaluation Templates

Advanced Templates

  • TS-LLM Template - Chronos, TimesFM, Future-Guided Learning implementation

Related Skills

For adjacent topics, reference these skills:

  • ai-ml-data-science - EDA workflows, feature engineering patterns, model evaluation, SQLMesh transformations
  • ai-mlops - Production deployment, automated monitoring (18-second drift detection), retraining pipelines
  • ai-llm - Fine-tuning approaches applicable to time series LLMs (Chronos, TimesFM)
  • ai-prompt-engineering - Prompt design patterns for time series LLMs
  • data-sql-optimization - SQL optimization for time series data storage and retrieval

External Resources

See data/sources.json for curated web resources including:

  • Classical methods (statsmodels, Prophet, ARIMA)
  • Deep learning frameworks (PyTorch Forecasting, GluonTS, Darts, NeuralProphet)
  • Transformer models (TimesFM, Chronos, Lag-Llama, Informer, Autoformer)
  • Anomaly detection tools (PyOD, STUMPY, Isolation Forest)
  • Feature engineering libraries (tsfresh, TSFuse, Featuretools)
  • Production deployment (Kats, MLflow, sktime)
  • Benchmarks and datasets (M5 Competition, Monash Time Series, UCI)

Usage Notes

For Claude:

  • Activate this skill for hands-on forecasting tasks, feature engineering, backtesting, or production setup
  • Start with Quick Reference and Decision Tree for fast guidance
  • Drill into resources/ for detailed implementation patterns
  • Use templates/ for copy-paste ready code
  • Always check for temporal leakage (future data in training)
  • Prefer LightGBM for most use cases unless long-term dependencies require Transformers
  • Emphasize explainability for healthcare/finance domains
  • Monitor for data distribution shifts in production

Key Principle: Time series forecasting is about temporal structure, not IID assumptions. Use temporal validation, avoid future leakage, and choose models based on horizon length and data characteristics.
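
That principle reduces to one concrete habit: split by time order, never by shuffling. A minimal sketch (pandas assumed):

```python
import pandas as pd

def temporal_split(df, test_size):
    """Order-preserving split: the most recent `test_size` rows become
    the test set. Never shuffle a time series before splitting."""
    return df.iloc[:-test_size], df.iloc[-test_size:]

df = pd.DataFrame({"y": range(10)},
                  index=pd.date_range("2024-01-01", periods=10, freq="D"))
train, test = temporal_split(df, test_size=2)
```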