.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "_examples\forecasting\plot_wls_basic.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        :ref:`Go to the end <sphx_glr_download__examples_forecasting_plot_wls_basic.py>`
        to download the full example code

.. rst-class:: sphx-glr-example-title

.. _sphx_glr__examples_forecasting_plot_wls_basic.py:


WLS - Basic
-------------------

.. GENERATED FROM PYTHON SOURCE LINES 6-119



.. image-sg:: /_examples/forecasting/images/sphx_glr_plot_wls_basic_001.png
   :alt: plot wls basic
   :srcset: /_examples/forecasting/images/sphx_glr_plot_wls_basic_001.png
   :class: sphx-glr-single-img


.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    c:\users\kelda\desktop\repositories\virtualenvs\venv-py391-pyamr\lib\site-packages\statsmodels\regression\linear_model.py:807: RuntimeWarning: divide by zero encountered in log

    Series:
    wls-rsquared                  0.4773
    wls-rsquared_adj               0.472
    wls-fvalue                   89.4894
    wls-fprob                        0.0
    wls-aic                          inf
    wls-bic                          inf
    wls-llf                         -inf
    wls-mse_model            137903.9406
    wls-mse_resid               1541.008
    wls-mse_total              2918.4114
    wls-const_coef              275.6131
    wls-const_std                18.2135
    wls-const_tvalue             15.1324
    wls-const_tprob                  0.0
    wls-const_cil                239.469
    wls-const_ciu               311.7571
    wls-x1_coef                    2.405
    wls-x1_std                    0.2542
    wls-x1_tvalue                 9.4599
    wls-x1_tprob                     0.0
    wls-x1_cil                    1.9005
    wls-x1_ciu                    2.9095
    wls-s_dw                       0.488
    wls-s_jb_value                 7.949
    wls-s_jb_prob                 0.0188
    wls-s_skew                     0.566
    wls-s_kurtosis                 3.791
    wls-s_omnibus_value            8.126
    wls-s_omnibus_prob             0.017
    wls-m_dw                      0.1295
    wls-m_jb_value                6.1737
    wls-m_jb_prob                 0.0456
    wls-m_skew                   -0.5865
    wls-m_kurtosis                 3.325
    wls-m_nm_value                6.6625
    wls-m_nm_prob                 0.0357
    wls-m_ks_value                0.5736
    wls-m_ks_prob                    0.0
    wls-m_shp_value               0.9415
    wls-m_shp_prob                0.0002
    wls-m_ad_value                2.4991
    wls-m_ad_nnorm                 False
    wls-missing                    raise
    wls-exog               [[1.0, 0.0...
    wls-endog              [41.719565...
    wls-trend                          c
    wls-weights            [0.0435648...
    wls-W

    Summary:
                     coef    std err          t      P>|t|      [0.025      0.975]
    ------------------------------------------------------------------------------
    const        275.6131     18.213     15.132      0.000     239.469     311.757
    x1             2.4050      0.254      9.460      0.000       1.900       2.910
    ==============================================================================
    Omnibus:                        8.126   Durbin-Watson:                   0.488
    Prob(Omnibus):                  0.017   Jarque-Bera (JB):                7.949
    Skew:                           0.566   Prob(JB):                       0.0188
    Kurtosis:                       3.791   Cond. No.                         234.
    Normal (N):                     6.662   Prob(N):                         0.036
    ==============================================================================




|

.. code-block:: default
   :lineno-start: 6


    # Import class.
    import sys
    import numpy as np
    import pandas as pd

    import matplotlib as mpl
    import matplotlib.pyplot as plt
    import statsmodels.api as sm
    import statsmodels.robust.norms as norms

    # Import weights.
    from pyamr.datasets.load import make_timeseries
    from pyamr.core.regression.wls import WLSWrapper
    from pyamr.metrics.weights import SigmoidA

    # ----------------------------
    # set basic configuration
    # ----------------------------
    # Matplotlib options
    mpl.rc('legend', fontsize=6)
    mpl.rc('xtick', labelsize=6)
    mpl.rc('ytick', labelsize=6)

    # Set pandas configuration.
    pd.set_option('display.max_colwidth', 14)
    pd.set_option('display.width', 150)
    pd.set_option('display.precision', 4)

    # ----------------------------
    # create data
    # ----------------------------
    # Create timeseries data
    x, y, f = make_timeseries()

    # Create method to compute weights from frequencies
    W = SigmoidA(r=200, g=0.5, offset=0.0, scale=1.0)

    # Note that the fit method will call W.weights(weights) internally and
    # will store the weight converter W in the instance. Therefore, the code
    # executed here is equivalent to fitting with the pre-computed weights
    # directly (see the commented sketch below); the only difference is that
    # the weight converter would not be saved.
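
    # NOTE: the commented snippet below is an illustrative sketch added for
    # clarity and is not part of the original script. Assuming SigmoidA
    # exposes the weights() method mentioned above, pre-computing the weights
    # and omitting the converter argument would yield the same fit, only
    # without storing the converter in the wrapper:
    #
    #   w = W.weights(f)
    #   wls = WLSWrapper(estimator=sm.WLS).fit(
    #       exog=x, endog=y, trend='c',
    #       weights=w, missing='raise')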

    wls = WLSWrapper(estimator=sm.WLS).fit(
        exog=x, endog=y, trend='c',
        weights=f, W=W, missing='raise')

    # Print series.
    print("\nSeries:")
    print(wls.as_series())

    # Print regression line.
    print("\nRegression line:")
    print(wls.line(np.arange(10)))

    # Print summary.
    print("\nSummary:")
    print(wls.as_summary())

    # -----------------
    # Save & Load
    # -----------------
    # File location
    #fname = '../../examples/saved/wls-sample.pickle'

    # Save
    #wls.save(fname=fname)

    # Load
    #wls = WLSWrapper().load(fname=fname)

    # -------------
    # Example I
    # -------------
    # This example shows how to make predictions using the wrapper and how
    # to plot the resulting data. In addition, it compares the intervals
    # provided by get_prediction (confidence intervals) and the intervals
    # provided by wls_prediction_std (prediction intervals).
    #
    # To Do: Implement methods to compute CI and PI (see regression).

    # Variables.
    start, end = None, 180

    # Compute predictions (exogenous?). It returns a 2D array where the rows
    # contain the time (t), the mean, and the lower and upper confidence
    # (or prediction?) intervals.
    preds = wls.get_prediction(start=start, end=end)

    # Create figure
    fig, ax = plt.subplots(1, 1, figsize=(11, 5))

    # Plotting confidence intervals
    # -----------------------------
    # Plot truth values.
    ax.plot(x, y, color='#A6CEE3', alpha=0.5, marker='o',
            markeredgecolor='k', markeredgewidth=0.5,
            markersize=5, linewidth=0.75, label='Observed')

    # Plot forecasted values.
    ax.plot(preds[0, :], preds[1, :], color='#FF0000', alpha=1.00,
            linewidth=2.0, label=wls._identifier(short=True))

    # Plot the confidence intervals.
    ax.fill_between(preds[0, :], preds[2, :], preds[3, :],
                    color='r', alpha=0.1)

    # Legend
    plt.legend()

    # Show
    plt.show()


.. rst-class:: sphx-glr-timing

   **Total running time of the script:** ( 0 minutes 0.114 seconds)


.. _sphx_glr_download__examples_forecasting_plot_wls_basic.py:

.. only:: html

  .. container:: sphx-glr-footer sphx-glr-footer-example

    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: plot_wls_basic.py <plot_wls_basic.py>`

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: plot_wls_basic.ipynb <plot_wls_basic.ipynb>`

.. only:: html

 .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery <https://sphinx-gallery.github.io>`_