ANOVA (analysis of variance table)

ARIMA Model

Augmented Dickey-Fuller (ADF)

Unit Root Test

Autoregression

Autoregressive Model (AR) [1]

In statistics and signal processing, an autoregressive (AR) model is a representation of a type of random process; as such, it is used to describe certain time-varying processes in nature, economics, etc. The autoregressive model specifies that the output variable depends linearly on its own previous values and on a stochastic term (an imperfectly predictable term); thus the model is in the form of a stochastic difference equation.

A common approach for modeling univariate time series is the autoregressive (AR) model:

$X_t = \delta + \phi_1 X_{t-1} + \phi_2 X_{t-2} + \cdots + \phi_p X_{t-p} + A_t,$

where $X_t$ is the time series, $A_t$ is white noise, and

$\delta = \left(1 - \sum_{i=1}^{p} \phi_i\right)\mu,$

with $\mu$ denoting the process mean.

An autoregressive model is simply a linear regression of the current value of the series against one or more prior values of the series. The value of $p$ is called the order of the AR model.

AR models can be analyzed with one of various methods, including standard linear least squares techniques. They also have a straightforward interpretation.
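As a sketch of the least-squares approach just mentioned (assuming numpy is available; the coefficients 0.6 and -0.3 are arbitrary illustrative choices), one can simulate an AR(2) process and recover its coefficients by ordinary least squares:

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulate an AR(2) process: x_t = 0.6*x_{t-1} - 0.3*x_{t-2} + w_t
n = 2000
w = rng.normal(0.0, 1.0, n)              # Gaussian white noise
x = np.zeros(n)
for t in range(2, n):
    x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + w[t]

# Linear regression of the current value on the two prior values
X = np.column_stack([x[1:-1], x[:-2]])   # lag-1 and lag-2 columns
y = x[2:]
phi, *_ = np.linalg.lstsq(X, y, rcond=None)
print(phi)                               # close to [0.6, -0.3]
```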

Autocorrelation [3]

Autocorrelation, also known as serial correlation, is the correlation of a signal with a delayed copy of itself as a function of delay. Informally, it is the similarity between observations as a function of the time lag between them.

The analysis of autocorrelation is a mathematical tool for finding repeating patterns, such as the presence of a periodic signal obscured by noise, or identifying the missing fundamental frequency in a signal implied by its harmonic frequencies.
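A minimal sketch of the sample autocorrelation (assuming numpy; the period-12 sine is an arbitrary illustration of a periodic signal obscured by noise):

```python
import numpy as np

def acf(x, max_lag):
    """Sample autocorrelation of x for lags 0..max_lag."""
    x = np.asarray(x, dtype=float)
    xc = x - x.mean()
    denom = np.sum(xc ** 2)
    return np.array([np.sum(xc[k:] * xc[:x.size - k]) / denom
                     for k in range(max_lag + 1)])

# A periodic signal buried in noise: the ACF peaks again near lag 12
t = np.arange(300)
x = np.sin(2 * np.pi * t / 12) + np.random.default_rng(0).normal(0, 0.5, t.size)
print(acf(x, 15).round(2))
```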

Autocorrelation Function (ACF)

Autocovariance [24]

Box-Jenkins Forecasting Method

The univariate version of this methodology is a self-projecting time series forecasting method. The underlying goal is to find an appropriate formula so that the residuals are as small as possible and exhibit no pattern. The model-building process involves a few steps, repeated as necessary, to end up with a specific formula that replicates the patterns in the series as closely as possible and also produces accurate forecasts.

Central Limit Theorem (CLT)

Correlation

Covariance [12]

In probability theory and statistics, covariance is a measure of the joint variability of two random variables.

If the greater values of one variable mainly correspond with the greater values of the other variable, and the same holds for the lesser values, i.e., the variables tend to show similar behavior, the covariance is positive.

In the opposite case, when the greater values of one variable mainly correspond to the lesser values of the other, i.e., the variables tend to show opposite behavior, the covariance is negative.

The sign of the covariance therefore shows the tendency in the linear relationship between the variables. The magnitude of the covariance is not easy to interpret. The normalized version of the covariance, the correlation coefficient, however, shows by its magnitude the strength of the linear relation.

Correlation Coefficient [5]

In statistics, the Pearson correlation coefficient (PCC, pronounced /ˈpɪərsən/), also referred to as the Pearson's r or Pearson product-moment correlation coefficient (PPMCC), is a measure of the linear dependence (correlation) between two variables X and Y. It has a value between +1 and −1 inclusive, where 1 is total positive linear correlation, 0 is no linear correlation, and −1 is total negative linear correlation.
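A minimal sketch of both quantities (assuming numpy; the relationship y = 2x + noise is an arbitrary illustration of positive linear dependence):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, 500)
y = 2.0 * x + rng.normal(0.0, 1.0, 500)   # tends to move with x

print(np.cov(x, y)[0, 1])        # sample covariance: positive here
print(np.corrcoef(x, y)[0, 1])   # Pearson r: same sign, scaled into [-1, 1]
```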

Cross-Correlation

Cross-Covariance Function (CCF)

Cyclic Pattern

Exists when data exhibit rises and falls that are not of fixed period (duration usually of at least 2 years).

Data Generating Process (DGP)

Detrending

Decomposition Analysis

It is the pattern generated by the time series, and not necessarily the individual data values, that is of interest to the manager who is an observer, a planner, or a controller of the system. Therefore, Decomposition Analysis is used to identify several patterns that appear simultaneously in a time series.
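As one possible illustration (assuming statsmodels is installed; the monthly period of 12 and the additive model are assumptions made for this example), seasonal_decompose splits a series into trend, seasonal, and residual patterns:

```python
import numpy as np
from statsmodels.tsa.seasonal import seasonal_decompose

t = np.arange(120)                           # ten "years" of monthly data
series = (0.5 * t                            # trend
          + 10 * np.sin(2 * np.pi * t / 12)  # seasonal pattern
          + np.random.default_rng(2).normal(0, 1, t.size))

result = seasonal_decompose(series, model="additive", period=12)
print(result.seasonal[:12])   # one full cycle of the seasonal component
print(result.trend[6:12])     # centered moving-average trend (ends are NaN)
```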

Deseasonalizing

Deseasonalizing the data, also called Seasonal Adjustment, is the process of removing recurrent and periodic variations over a short time frame, e.g., weeks, quarters, or months. Seasonal variations are regularly repeating movements in series values that can be tied to recurring events. The deseasonalized data are obtained by simply dividing each time series observation by the corresponding seasonal index.

Almost all time series published by the US government are already deseasonalized using a seasonal index to unmask the underlying trends in the data, which would otherwise be obscured by the seasonality factor.
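A minimal sketch of the division step described above (assuming numpy; the quarterly sales figures and seasonal index are made-up numbers):

```python
import numpy as np

sales = np.array([120.0, 80.0, 60.0, 140.0,      # year 1, by quarter
                  132.0, 88.0, 66.0, 154.0])     # year 2
seasonal_index = np.array([1.2, 0.8, 0.6, 1.4])  # averages to 1.0

# Divide each observation by the corresponding seasonal index
deseasonalized = sales / np.tile(seasonal_index, 2)
print(deseasonalized)   # underlying levels: 100 in year 1, 110 in year 2
```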

Differencing

The $d$th differencing operator applied to a time series $\{X_t\}$ creates a new series $\{Y_t\}$ whose value at time $t$ is the difference $Y_t = X_t - X_{t-d}$. This method works very well for removing trends and cycles. For example, first differencing applied to a series with a linear trend eliminates the trend, while if cycles of length $d$ exist in a series, a $d$th difference will remove them.
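A minimal numpy sketch of both effects (the cycle length of 12 is an arbitrary choice):

```python
import numpy as np

t = np.arange(100)
trend = 2.0 * t + 5.0                 # series with a linear trend
cycle = np.sin(2 * np.pi * t / 12)    # series with cycles of length 12

d1 = np.diff(trend)                   # first difference
print(np.allclose(d1, 2.0))           # True: the trend is eliminated

d12 = cycle[12:] - cycle[:-12]        # 12th difference (lag-12)
print(np.allclose(d12, 0.0))          # True: the cycle is removed
```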

Difference-Stationary

Econometrics

Estimator

Estimate

Estimand

Ergodicity

First Order Stationary

A time series $\{X_t\}$ is first order stationary if the expected value of $X_t$ remains the same for all $t$, i.e., $E[X_t] = \mu$ for all $t$.

For example, in economic time series, a process is first order stationary when we remove any kind of trend by some mechanism such as differencing.

Forecasting

Incorporating seasonality in a forecast is useful when the time series has both trend and seasonal components. The final step in the forecast is to use the seasonal index to adjust the trend projection. One simple way to forecast using a seasonal adjustment is to apply a seasonal factor to an appropriate projection of the underlying trend.
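A minimal sketch of that adjustment (assuming numpy; the fitted trend line and quarterly factors are made-up numbers):

```python
import numpy as np

def trend(t):                     # hypothetical fitted trend, t in quarters
    return 100.0 + 2.5 * t

seasonal_index = np.array([1.2, 0.8, 0.6, 1.4])   # quarterly factors

# Forecast one future year (quarters t = 20..23)
for t in range(20, 24):
    forecast = trend(t) * seasonal_index[t % 4]   # trend adjusted by season
    print(f"t={t}: {forecast:.1f}")
```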

Frequency-Domain

Gaussian White Noise

A white noise series in which the $w_t$ are independent normal random variables, with mean $0$ and variance $\sigma_w^2$; or, more succinctly, $w_t \sim \mathrm{iid}\ N(0, \sigma_w^2)$.
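A minimal sketch (assuming numpy; $\sigma_w = 1$ is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(loc=0.0, scale=1.0, size=1000)   # w_t ~ iid N(0, 1)
print(w.mean(), w.var())                        # near 0 and 1, respectively
```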

Gaussian

Another term for the normal distribution. [Google]

iid

Independent and identically distributed ($\mathrm{iid}$) random variables, e.g., with mean $\mu$ and variance $\sigma^2$.

Kwiatkowski-Phillips-Schmidt-Shin (KPSS) test

Law of Large Numbers (LLN)

Linear

Mean Reverting

Model

An external and explicit representation of a part of reality, as it is seen by individuals who wish to use this model to understand, change, manage and control that part of reality.

Moments

Moving-Average Model (MA)

In time series analysis, the Moving-Average (MA) Model is a common approach for modeling univariate time series. The moving-average model specifies that the output variable depends linearly on the current and various past values of a stochastic (imperfectly predictable) term. [Wikipedia]
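As a sketch (assuming numpy; $\theta = 0.8$ is an arbitrary choice), the following simulates an MA(1) process $x_t = w_t + \theta\, w_{t-1}$ and checks its lag-1 autocorrelation against the theoretical value $\theta / (1 + \theta^2) \approx 0.49$:

```python
import numpy as np

rng = np.random.default_rng(3)
n, theta = 5000, 0.8
w = rng.normal(0.0, 1.0, n + 1)   # white noise, one extra value for the lag

# MA(1): output depends on the current and the previous noise term
x = w[1:] + theta * w[:-1]

xc = x - x.mean()
r1 = np.sum(xc[1:] * xc[:-1]) / np.sum(xc ** 2)
print(r1)                         # close to 0.8 / (1 + 0.8**2) ~= 0.49
```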

Multivariate

Non-Parametric

Non-Linear

Parametric

Periodicity

Purely Random Process

A stochastic process in which each element $X_t$ is (statistically) independent of every other element $X_s$, for $s \neq t$, and each element has an identical distribution.

Quartile

Random Walk

Regression

Residuals

Residual Decompositions

Seasonality [14]

In time series data, seasonality is the presence of variations that occur at specific regular intervals less than a year, such as weekly, monthly, or quarterly. Seasonality may be caused by various factors, such as weather, vacation, and holidays, and consists of periodic, repetitive, and generally regular and predictable patterns in the levels of a time series.

Seasonal Adjustment [15]

Seasonal adjustment is a statistical method for removing the seasonal component of a time series that exhibits a seasonal pattern. It is usually done when one wants to analyze the trend, and the cyclical deviations from trend, of a time series independently of the seasonal components. For example, it is normal to report seasonally adjusted data for unemployment rates to reveal the underlying trends and cycles in labor markets.

Many economic phenomena have seasonal cycles, such as agricultural production and consumer consumption, e.g., greater consumption leading up to Christmas. It is necessary to adjust for this component in order to understand the underlying trends in the economy, so official statistics are often adjusted to remove seasonal components.

Seasonal Pattern

Exists when a series is influenced by seasonal factors (e.g., the quarter of the year, the month, or day of the week).

Second Order Stationary

A time series $\{X_t\}$ is second order stationary if it is first order stationary and the covariance between $X_t$ and $X_s$ is a function of the lag $|t - s|$ only.

Again, in economic time series, a process is second order stationary when we also stabilize its variance by some kind of transformation, such as taking the square root.

Serial Correlation

see Autocorrelation

Stochastic

Randomly determined; having a random probability distribution or pattern that may be analyzed statistically but may not be predicted precisely. [Google]

Stationary Time Series [13]

Stationarity has always played a major role in time series analysis. To perform forecasting, most techniques require stationarity conditions. Therefore, we need to establish some conditions, e.g., that the time series is a first and second order stationary process.

A stationary time series is one whose statistical properties, such as mean, variance, and autocorrelation, are all constant over time.

Stationarity

A stationary process has the property that the mean, variance and autocorrelation structure do not change over time.

Strict Stationarity

Time Domain [10]

Time Series [8]

A time series is a series of data points indexed (or listed or graphed) in time order. Most commonly, a time series is a sequence taken at successive equally spaced points in time. Thus it is a sequence of discrete-time data. Examples of time series are heights of ocean tides, counts of sunspots, and the daily closing value of the Dow Jones Industrial Average.

Time series are very frequently plotted via line charts. Time series are used in statistics, signal processing, pattern recognition, econometrics, mathematical finance, weather forecasting, intelligent transport and trajectory forecasting, earthquake prediction, electroencephalography, control engineering, astronomy, communications engineering, and largely in any domain of applied science and engineering which involves temporal measurements.

Time series analysis [8]

Comprises methods for analyzing time series data in order to extract meaningful statistics and other characteristics of the data. Time series forecasting is the use of a model to predict future values based on previously observed values. While regression analysis is often employed in such a way as to test theories that the current values of one or more independent time series affect the current value of another time series, this type of analysis of time series is not called "time series analysis", which focuses on comparing values of a single time series or multiple dependent time series at different points in time.

Trend

Trend Analysis

Uses linear and nonlinear regression with time as the explanatory variable; it is used where the pattern over time has a long-term trend. Unlike most time-series forecasting techniques, Trend Analysis does not assume the condition of equally spaced time series.

Trend Pattern

Exists when there is a long-term increase or decrease in the data.

Univariate

The term "univariate time series" refers to a time series that consists of single (scalar) observations recorded sequentially over equal time increments.

Variance

Vector Autoregressions (VARs)

is an econometric model used to capture the linear interdependencies among multiple time series. VAR models generalize the Univariate Autoregressive Model (AR model) by allowing for more than one evolving variable. All variables in a VAR enter the model in the same way: each variable has an equation explaining its evolution based on its own lagged values, the lagged values of the other model variables, and an error term.

VAR modeling does not require as much knowledge about the forces influencing a variable as do structural models with simultaneous equations: the only prior knowledge required is a list of variables which can be hypothesized to affect each other intertemporally. [Wikipedia]
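A minimal numpy sketch of the idea (not a production estimator; the coefficient matrix A is an arbitrary stable choice): simulate a two-variable VAR(1) and recover the coefficients by equation-by-equation least squares, each variable regressed on the lagged values of both.

```python
import numpy as np

rng = np.random.default_rng(4)
A = np.array([[0.5, 0.2],    # row i: coefficients in variable i's equation
              [0.1, 0.4]])   # on the lagged values of both variables
n = 3000
y = np.zeros((n, 2))
for t in range(1, n):
    y[t] = A @ y[t - 1] + rng.normal(0.0, 1.0, 2)   # VAR(1) with error term

# OLS: regress y_t on y_{t-1}, one equation per variable
X, Y = y[:-1], y[1:]
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T
print(A_hat.round(2))        # close to A
```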

Weak Stationarity [25]

If $\{x_t\}$ is a weakly stationary process, then the following are true:

$E[x_t] = \mu$, for all $t$,

and

$\mathrm{Cov}(x_t, x_{t+\tau}) = \gamma(\tau)$, where $\tau$ is the lag time, or the amount of time by which the signal has been shifted.

White Noise

A simple kind of generated series might be a collection of uncorrelated random variables, $w_t$, with mean $0$ and finite variance $\sigma_w^2$.

The time series $\{w_t\}$ is white or independent noise if the sequence of random variables is independent and identically distributed (iid).

Wide-Sense Stationarity
