Journal of Jilin Institute of Chemical Technology, 2019, 36(1): 90-93     https://doi.org/10.16039/j.cnki.cn22-1249.2019.01.020
Empirical Likelihood Inference for Integer Value Auto-regressive Model Based on Binomial Thinning Operator
ZHANG Qing-chun, LI Xiao-mei, FAN Xiao-dong
School of Science, Jilin Institute of Chemical Technology

Abstract: 

Integer-valued time series arise in many fields, such as transportation, finance, education, environment, and insurance, and the thinning operator is the main tool for studying them. In this paper, the first-order integer-valued autoregressive model based on the binomial thinning operator is studied by the empirical likelihood method, and the empirical likelihood inference for the BINAR(1) model with Poisson innovations, together with the maximum empirical likelihood estimates, is given. The performance of the empirical likelihood estimation is examined through numerical simulation. Finally, an application of the model is illustrated with a crime data set.
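The model studied here is the recursion X_t = α ∘ X_{t-1} + ε_t, where ∘ denotes binomial thinning (α ∘ X is a sum of X independent Bernoulli(α) variables) and the innovations ε_t are Poisson. The paper itself contains no code; the following plain-Python sketch simulates such a series under those assumptions (the names `binomial_thinning`, `poisson`, and `simulate_inar1` are illustrative, not from the paper):

```python
import math
import random

def binomial_thinning(x, alpha, rng):
    """Binomial thinning alpha ∘ x: each of the x counts survives
    independently with probability alpha, so alpha ∘ x ~ Binomial(x, alpha)."""
    return sum(1 for _ in range(x) if rng.random() < alpha)

def poisson(lam, rng):
    """Poisson(lam) draw via Knuth's multiplication method (fine for small lam)."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def simulate_inar1(alpha, lam, n, seed=0):
    """Simulate X_t = alpha ∘ X_{t-1} + eps_t with eps_t ~ Poisson(lam),
    starting from the stationary Poisson(lam / (1 - alpha)) distribution."""
    rng = random.Random(seed)
    x = [poisson(lam / (1.0 - alpha), rng)]
    for _ in range(n - 1):
        x.append(binomial_thinning(x[-1], alpha, rng) + poisson(lam, rng))
    return x

series = simulate_inar1(alpha=0.5, lam=2.0, n=5000, seed=42)
# the stationary mean is lam / (1 - alpha) = 4; the sample mean should be close
print(sum(series) / len(series))
```

The sample mean of a long simulated path should settle near λ / (1 − α), which gives a quick sanity check on the simulator before feeding such series into an estimation procedure.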

Key words:  binomial thinning operator    INAR(1) model    innovation    empirical likelihood method
               Publication date: 2019-01-25      Online date: 2019-01-25      Issue date: 2019-01-25
CLC number:  O212.1
Cite this article:
ZHANG Qing-chun, LI Xiao-mei, FAN Xiao-dong. Empirical Likelihood Inference for Integer Value Auto-regressive Model Based on Binomial Thinning Operator. Journal of Jilin Institute of Chemical Technology, 2019, 36(1): 90-93.
Link to this article:
http://xuebao.jlict.edu.cn/CN/10.16039/j.cnki.cn22-1249.2019.01.020  or  http://xuebao.jlict.edu.cn/CN/Y2019/V36/I1/90