
Feature Construction and Selection for PV Solar Power Modeling (2202.06226v1)

Published 13 Feb 2022 in eess.SY, cs.LG, and cs.SY

Abstract: Using solar power in the process industry can reduce greenhouse gas emissions and make the production process more sustainable. However, the intermittent nature of solar power renders its usage challenging. Building a model to predict photovoltaic (PV) power generation allows decision-makers to hedge energy shortages and further design proper operations. The solar power output is time-series data dependent on many factors, such as irradiance and weather. A machine learning framework for 1-hour ahead solar power prediction is developed in this paper based on the historical data. Our method extends the input dataset into higher dimensional Chebyshev polynomial space. Then, a feature selection scheme is developed with constrained linear regression to construct the predictor for different weather types. Several tests show that the proposed approach yields lower mean squared error than classical machine learning methods, such as support vector machine (SVM), random forest (RF), and gradient boosting decision tree (GBDT).

Citations (4)

Summary

  • The paper proposes a machine learning model that constructs rich feature sets and selects optimal predictors for varying weather conditions.
  • It employs Chebyshev polynomial expansion and trigonometric encoding to capture non-linear and cyclical patterns in PV output.
  • The constrained linear regression, compared against SVR, RF, and GBDT, significantly improves 1-hour ahead solar power forecasts.

This paper presents a machine learning framework for predicting 1-hour ahead photovoltaic (PV) solar power generation using historical weather data and past power output. The core idea is to improve prediction accuracy by constructing a rich set of features derived from basic inputs and then selecting the most relevant ones for different weather conditions using a constrained linear regression model.

Problem: Accurate short-term solar power prediction is crucial for grid stability and energy management due to the intermittent nature of solar energy. Existing methods often struggle with varying weather conditions.

Proposed Method:

  1. Data Preparation:
    • Inputs: Temperature, dew point, humidity, wind speed, time of day, and the solar power output from the previous 15-minute interval ($y(k)$).
    • Data Source: Weather data from Long Beach, CA (Jan-Jun 2014) and 15-minute solar power data from the California Solar Initiative (CSI) for a PV system at Long Beach airport.
    • Preprocessing: The data is aligned by timestamp. The mean daily power profile $\bar{y}(k)$, computed from the training set, is subtracted from the actual power output so that the model predicts the deviation $y'(k) = y(k) - \bar{y}(k)$. This detrending focuses the model on predicting variations from the average pattern.
    • Weather Types: Data is categorized into three weather types: cloudy (includes mostly/partly cloudy), fair, and haze. Separate models are trained for each type.
  2. Feature Construction:
    • Basic Regressor ($\phi$): Initial features include the detrended past power output ($y'(k)$), weather variables ($\bm{u}_1$ to $\bm{u}_4$), and time ($\bm{u}_5$).
    • Extended Regressor ($\phi^*$): Time is additionally encoded with trigonometric functions, $\bm{u}_6 = \cos(\pi \bm{u}_5/24)$ and $\bm{u}_7 = \sin(\pi \bm{u}_5/24)$, to capture cyclical daily patterns.
    • Normalization ($\tilde{\phi}$): The extended features are normalized to the range $[-1, 1]$.
    • Chebyshev Polynomials: Each normalized feature in $\tilde{\phi}$ is expanded into a higher-dimensional space using the first 11 Chebyshev polynomials ($C_0$ to $C_{10}$). This creates non-linear transformations of the original inputs.
    • Interaction Features ($C_{11}$ to $C_{13}$): To account for the varying impact of clouds at different times of day, interaction terms are created by multiplying one-hot encoded cloud types ($\bm{u}_8, \bm{u}_9, \bm{u}_{10}$ for cloudy, mostly cloudy, partly cloudy) with the normalized time feature ($\bm{u}_5/24$).
  3. Feature Selection and Model Training:
    • Wrapper Method: A sequential forward selection and backward elimination approach (Algorithm 1) selects the best subset of constructed features $\Psi_{[i]}$ for each weather type $i$. The selection criterion is the mean squared error (MSE) on a dedicated validation dataset.
    • Constrained Linear Regression: For a chosen feature subset $\Psi_{[i]}$, the model coefficients $\bm{a}_{[i]}$ are determined by solving a constrained least squares problem:

      $$\min_{\bm{a}_{[i]}} \sum_{k\in\Gamma_{[i]}} \Big( y'(k+1) - \sum_{\bm{C}_{w,j}\in\Psi_{[i]}} \bm{a}_{[i],w,j}\, \bm{C}_{w,j} \Big)^2$$

      subject to the physical constraint that the predicted power must be non-negative:

      $$\sum_{\bm{C}_{w,j}\in\Psi_{[i]}} \bm{a}_{[i],w,j}\, \bm{C}_{w,j} + \bar{y}(k+1) \ge 0$$

    • Multi-step Refinement (Algorithm 2): The feature selection process is iteratively refined by evaluating the combined multi-step (1 to 4 steps ahead) prediction error on the validation set across all weather types.
    • Boundedness: A constraint $\sum |a_{[i],w,j}| \le 1$ can be added to the least squares problem to ensure the predicted deviation $\hat{y}'$ remains bounded, especially for multi-step predictions, provided the inputs are appropriately scaled.
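As an illustration of the Chebyshev expansion and the non-negativity-constrained fit described above, the sketch below uses synthetic data with a single normalized feature. `scipy.optimize.minimize` with SLSQP stands in for whatever QP solver the authors used; the data and variable names are assumptions, not the paper's code.

```python
import numpy as np
from numpy.polynomial import chebyshev
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Toy stand-ins for the paper's quantities (illustrative, not the CSI data):
# x: one normalized feature in [-1, 1]; y_dev: detrended power y'(k+1);
# y_bar: mean daily profile ybar(k+1).
n = 200
x = rng.uniform(-1.0, 1.0, size=n)
y_bar = np.full(n, 2.0)
y_dev = 0.5 * x**2 - 0.3 * x + 0.05 * rng.standard_normal(n)

# Expand the feature with Chebyshev polynomials C_0..C_10 (11 columns).
Phi = chebyshev.chebvander(x, deg=10)          # shape (n, 11)

# Constrained least squares: minimize ||y_dev - Phi a||^2
# subject to Phi a + y_bar >= 0 (predicted power must be non-negative).
def objective(a):
    r = y_dev - Phi @ a
    return r @ r

constraints = [{"type": "ineq", "fun": lambda a: Phi @ a + y_bar}]
res = minimize(objective, np.zeros(Phi.shape[1]),
               method="SLSQP", constraints=constraints)

# Reconstructed power forecast: deviation prediction plus mean profile.
y_hat = Phi @ res.x + y_bar
assert np.all(y_hat >= -1e-6)                  # constraint holds to solver tolerance
```

The problem is a convex quadratic program, so any QP-capable solver (e.g. OSQP, CVXPY) would serve equally well; SLSQP is used here only because it ships with SciPy.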

Implementation Details:

  • The model predicts the deviation $y'(k+1)$; a 1-hour-ahead forecast corresponds to 4 steps of 15-minute intervals.
  • The final prediction is $\hat{y}(k+1) = \hat{y}'(k+1) + \bar{y}(k+1)$.
  • The constrained least squares is a convex optimization problem, solvable efficiently.
  • Comparison models (SVR, RF, LightGBM) were implemented using scikit-learn and LightGBM libraries. Hyperparameters for these models were tuned using GridSearchCV (for SVR) and FLAML (for LightGBM) on the validation set.
  • Evaluation uses the combined MSE over 1-to-4 step ahead predictions on unseen test datasets.
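The detrend-then-reconstruct convention above can be sketched as follows; the synthetic bell-shaped daily curve and the per-slot averaging are illustrative assumptions, not the paper's data pipeline.

```python
import numpy as np

# 15-minute samples indexed by slot-of-day; the mean daily profile is
# taken per slot across days (an assumed helper, not the paper's code).
slots_per_day = 96                              # 24 h * 4 samples/hour
days = 5
rng = np.random.default_rng(1)

t = np.arange(days * slots_per_day)
slot = t % slots_per_day
# Synthetic bell-shaped daily power curve plus noise, clipped at zero.
base = np.clip(np.sin(np.pi * slot / slots_per_day), 0.0, None)
y = np.clip(base + 0.1 * rng.standard_normal(t.size), 0.0, None)

# Mean daily profile ybar(k) from "training" data, one value per slot.
y_bar = np.array([y[slot == s].mean() for s in range(slots_per_day)])

# Deviation series the model actually fits: y'(k) = y(k) - ybar(k).
y_dev = y - y_bar[slot]

# After predicting y'_hat(k+1), the final forecast adds the profile back:
# y_hat(k+1) = y'_hat(k+1) + ybar(k+1). With a zero-deviation prediction:
y_hat = 0.0 + y_bar[slot]
```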

Results:

  • The proposed method, combining Chebyshev feature expansion and constrained regression with feature selection, generally outperformed standard SVR, RF, and GBDT models on the test datasets evaluated (Table III).
  • Feature selection identified different subsets of features as optimal for different weather types and time periods (datasets). High-order polynomials and interaction terms were particularly useful for cloudy conditions.
  • Predictions were more accurate for fair and haze conditions compared to cloudy days, where solar power exhibits higher variability.
  • The results suggest that domain-specific feature engineering and selection can lead to better performance than relying solely on complex model architectures with raw features.

Practical Applications:

This approach provides a concrete method for improving short-term solar power forecasts. Implementing it involves:

  1. Gathering relevant weather data (temperature, dew point, humidity, wind speed, cloud type/description) and historical PV power output at a suitable frequency (e.g., 15 minutes).
  2. Implementing the data preprocessing steps: time alignment, calculating and subtracting the mean daily profile.
  3. Coding the feature construction pipeline: adding time-based trigonometric features, normalizing, applying Chebyshev polynomials, and creating cloud-time interaction terms.
  4. Setting up a training/validation/testing split strategy appropriate for time-series data (e.g., rolling windows as used in the paper).
  5. Implementing the feature selection algorithms (Algorithms 1 and 2) coupled with a constrained least squares solver. Many optimization libraries (like scipy.optimize in Python) can handle constrained quadratic programming.
  6. Deploying the selected models (one for each weather type), switching between them based on the current weather forecast or observation.
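The forward-selection half of the wrapper in step 5 can be sketched as below, using ordinary (unconstrained) least squares as the inner fit for brevity; the paper's Algorithm 1 also includes backward elimination and the constrained fit. The function name and the toy data are hypothetical.

```python
import numpy as np

def forward_select(X_tr, y_tr, X_val, y_val, max_feats=5):
    """Greedy forward selection scored by validation MSE (a sketch of the
    wrapper idea; backward elimination and constraints are omitted)."""
    selected, remaining = [], list(range(X_tr.shape[1]))
    best_mse = np.inf
    while remaining and len(selected) < max_feats:
        scores = []
        for j in remaining:
            cols = selected + [j]
            # Ordinary least squares on the candidate feature subset.
            a, *_ = np.linalg.lstsq(X_tr[:, cols], y_tr, rcond=None)
            scores.append((np.mean((y_val - X_val[:, cols] @ a) ** 2), j))
        mse, j = min(scores)
        if mse >= best_mse:        # no candidate improves validation MSE: stop
            break
        best_mse = mse
        selected.append(j)
        remaining.remove(j)
    return selected, best_mse

# Toy demo: the target depends only on features 0 and 2.
rng = np.random.default_rng(2)
X = rng.standard_normal((300, 6))
y = 2.0 * X[:, 0] - 1.0 * X[:, 2] + 0.05 * rng.standard_normal(300)
sel, mse = forward_select(X[:200], y[:200], X[200:], y[200:])
```

Because the score is always computed on a held-out validation split, the greedy loop stops as soon as an added feature no longer generalizes, which is the same stopping intuition the paper's wrapper relies on.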

The method offers a balance between model simplicity (linear regression at its core) and representational power (through high-dimensional feature engineering), achieving strong predictive performance.