Alecos Papadopoulos | 1 Mar 18:15 2015

Re : function "ranking"

I run Gretl 1.10.0cvs (build date: 05 Feb 2015) for Windows, 64-bit.

I need to use the function "ranking" on column vectors of numbers in 
which there will always be exactly two equal numbers, with all others 
distinct. One of the two equal numbers comes from a different 
calculation path. The calculation of all these numbers involves natural 
logarithms.

The problem is that, in order for the "ranking" function to recognize 
the equality and assign half-integer ranks as programmed, the two 
numbers must be identical to whatever precision they are stored in Gretl.
In practice these two numbers are not totally identical. For example I 
obtain
2.8841797876482436
and
2.8841797876482445

But the difference is an artifact of the calculation: the two magnitudes 
represented by these numbers are identical. The difference is 
immaterial for all practical purposes, but the function "ranking", 
performing as it is supposed to, gives rank 12 to the first number and 
13 to the second. What I "should" get (so as to be consistent with the 
theoretical quantities) is 12.5 and 12.5. Moreover, it is not known 
a priori which of the two will be slightly above or below the other.

A roundabout way to deal with the problem does exist: multiply by some 
power of 10 representing the desired precision in decimals, truncate 
using the function "int", and then apply "ranking".
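
For concreteness, a minimal sketch of that workaround (the example 
vector and the choice of 8 decimals are purely illustrative assumptions):

<hansl>
# round away the spurious trailing digits before ranking
matrix x = {2.8841797876482436; 2.8841797876482445; 1.5; 3.7}
scalar k = 8                  # assumed precision, in decimals
matrix xr = int(x * 10^k)     # values equal up to k decimals become identical integers
matrix r = ranking(xr)        # the two near-equal entries now share a half-integer rank
print r
</hansl>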

I was just wondering whether it would be worth the programming trouble 
to enhance the "ranking" function directly, so as to "look" only as deep
[message truncated]

Mark Scerri | 1 Mar 17:37 2015

ADF TEST CASE 3

Hi,
I am rather new to time series analysis. I am trying to perform a unit root test. I have been advised to perform the test in three cases.
Case 1: no constant term or time trend; true process is a random walk.
Case 2: constant term but no time trend; true process is a random walk.
Case 3: constant term but no time trend; true process is a random walk with drift.

In Gretl, Case 1 is equivalent to performing the test with the "test without constant" option checked.
Case 2 is equivalent to performing the test with the "with constant" option checked.

For Case 3 I was told to perform Case 2 followed by OLS. I am not sure whether this means also checking "show regression results", and if so, how do I proceed then? If that is not the case, what should I do (in simple terms) after the Case 2 test to get the output for Case 3?
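
For reference, a minimal script-level sketch of the two cases described above; the series name y and the lag order 4 are placeholder assumptions, and the --verbose flag prints the underlying OLS test regression (which is what the "show regression results" box does in the dialog):

<hansl>
adf 4 y --nc            # Case 1: test without constant
adf 4 y --c --verbose   # Case 2: with constant; also show the test regression
</hansl>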

Many thanks
M
Daniel Bencik | 28 Feb 21:33 2015

Re: Regression with ex-ante boundaries imposed on forecasted values (Sven Schreiber)

>Am 24.02.2015 um 13:53 schrieb Daniel Bencik:
>> Dear forum,
>>
>> this is more of a question related to econometrics. When, for
>> example, your goal is to model/forecast weekly highs/weekly lows,
>> when you run your regression on Tuesdays, you already know that the
>> model should not predict a weekly high below the Monday's high. The
>> Wednesday's prediction of the whole week's high should not be below
>> max(mondayHigh, tuesdayHigh). My questions is whether there is an
>> econometric tool/approach that is capable of estimating a model
>> bearing this in mind. That is, I want the estimated coefficients to
>> take into account, that the forecasts should not be below/above some
>> value which changes over time (i.e. it is not a constant like e.g.
>> zero or something).
>>
>
>If I understand your question correctly, there is a trivial solution, 
>although you may not like it: Produce forecasts with standard tools, and 
>then apply your time-varying max() operator.
>
>Or you could specify your model in terms of squared deviations (or the 
>negative of that) centered on your previous high, and your restriction 
>would hold. I'm not sure that makes much sense, but econometrically it's 
>not a big deal unless you also want to have some other optimality 
>properties.
>
>cheers,
>sven

 

 

Sven, 

 

thank you very much. I thought about the second idea. So instead of

 

weeklyHigh[t] = f(....) + a*monHigh[t] + eps

 

I should regress 

 

weeklyHigh[t] - monHigh[t] = g(...) + eps2

 

where eps2 is an error with positive support, right? I am asking because this still poses some issues and mostly does not guarantee that the predicted weeklyHigh[t] - monHigh[t] cannot be < 0.
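
As a rough illustration of the first suggestion quoted above (produce forecasts with standard tools, then clamp them at the running bound), a minimal hansl sketch; the series names and regressors are placeholders, and $yhat merely stands in for whatever forecast the model produces:

<hansl>
ols weeklyHigh const monHigh x1                      # any standard specification
series fc = $yhat                                    # fitted values as a stand-in forecast
series fc_bounded = (fc < monHigh) ? monHigh : fc    # impose fc >= Monday's high
</hansl>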

 

Thank you, 

Daniel

 

 

Artur T. | 27 Feb 19:37 2015

gnuplot: 2nd yaxis

I've got another question regarding gnuplot. At the moment gretl 
automatically draws a 2nd y-axis (if it is not suppressed by the 
"--single-yaxis" option) if a certain criterion is fulfilled. However, is 
there a way to manually assign a particular variable (or a list of vars) 
to a 2nd y-axis?

Of course one could write a function doing so, more concretely by 
specifying the necessary lines, such as

<gnuplot>
.
.
.
set y2tics
plot \
'-' using 1:($2) axes x1y2 title "LRY (right)" w lines, \
'-' using 1:($2) axes x1y1 title "d_LRY (left)" w lines
.
.
</gnuplot>

but if it could be done within the literal-lines framework it would be 
much simpler... This is not a feature request -- writing a script is 
quickly done, as I've got some templates for this.

Best,
Artur
Artur T. | 27 Feb 09:10 2015

Fwd: Re: gnuplot: issue plotting monthly data in a panel

Hi Sven,

you're right, I should be more precise. You can find the resulting
graph attached -- it is simply empty. Using gretl's internal
"--time-series" option works, but for some reason the time dates
disappear for a restricted panel and the x-axis is just a sequence from
1 to T.

Artur

On 27.02.2015 09:00, Sven Schreiber wrote:
> Am 27.02.2015 um 08:24 schrieb Artur T.:
>
>> <hansl>
>>      smpl country==1 --restricted
>>      gnuplot x1 x2 tindex --with-lines --single-yaxis --output="@fname" \
>>        { set term eps font 'Helvetica,13' lw 3 ; \
>>        set xdata time ; \
>>        set timefmt "%Y%m" ; set format x "%m/%Y" ; }
>> </hansl>
>>
>> Unfortunately, the x-axis is not correctly shown. The temporary output
>> text-file you can find below. Does anybody have a clue what's wrong here?
>>
> Well, what do you mean by "not correctly shown", what is happening?
>
> -sven


Artur T. | 27 Feb 08:24 2015

gnuplot: issue plotting monthly data in a panel

Dear all,

I have a panel dataset and want to plot certain variables over time. I 
constructed a time index via: tindex = year*100 + month, so that for 
November 2014 I obtain something like "201411".

For plotting, the following command is used on a previously 
restricted dataset (evaluating data only for a single country):

<hansl>
     smpl country==1 --restricted
     gnuplot x1 x2 tindex --with-lines --single-yaxis --output="@fname" \
       { set term eps font 'Helvetica,13' lw 3 ; \
       set xdata time ; \
       set timefmt "%Y%m" ; set format x "%m/%Y" ; }
</hansl>

Unfortunately, the x-axis is not shown correctly. You can find the 
temporary output text file below. Does anybody have a clue what's wrong here?

Thanks,
Artur

<tmp_file>
set term post eps enhanced color solid
set output "/home/.../figs/ts_inflation_DE.eps"
set xlabel 'tindex'
set xzeroaxis
set datafile missing "?"
set key left top
set xrange [200068.225:201444.775]
# start literal lines
set term eps font 'Helvetica,13' lw 3
set ylabel 'in percent'
set key bottom horizontal outside
set yrange[-11.4089 : 15.8523]
set xdata time
set timefmt "%Y%b"
set format x "%m %Y"
# end literal lines
plot \
  '-' using 1:($2) title "Overall" w lines , \
  '-' using 1:($2) title "Energy" w lines
200001 ? # 2000:1
200002 ? # 2000:2
200003 ? # 2000:3
200004 ? # 2000:4
200005 ? # 2000:5
200006 ? # 2000:6
200007 ? # 2000:7
200008 ? # 2000:8
200009 ? # 2000:9
200010 ? # 2000:10
200011 ? # 2000:11
200012 ? # 2000:12
200101 1.307189542 # 2001:1
200102 1.739130435 # 2001:2
200103 1.739130435 # 2001:3
200104 2.176278564 # 2001:4
.
.
.
e
200001 ?
200002 ?
200003 ?
200004 ?
200005 ?
200006 ?
200007 ?
200008 ?
200009 ?
200010 ?
200011 ?
200012 ?
200101 9.408602151
200102 11.88251001
200103 8.842652796
.
.
.
e
</tmp_file>
Randy Kesselring | 27 Feb 00:52 2015

unbalanced panel data problem

At first this seemed a simple problem to me, but I have failed to find a solution. I have a very large unbalanced panel data set that I am having difficulty declaring as such. The data set includes a numerical year variable and a numerical month variable (1-12). Then, there are a large number of observations, and the number varies from month to month. Gretl had no problem reading the data set, but it was read as undated. At first I thought I simply needed a time index that included both the year and month information in a single variable. So, I created this: series tindex = year + month/100. This created the expected numerical variable. But when I tried to use this as an index variable in the panel data structure declaration, it doesn't show up in the drop-down menu. This made me hesitant to try it in a Hansl script. I'm sure that I have misunderstood something quite simple, but I would appreciate some direction.
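
For what it's worth, a hedged sketch of the script-level route; unit_id is a placeholder for the cross-sectional identifier, and an integer-valued time index is assumed (which may be why a fractional year + month/100 index is not offered as a candidate):

<hansl>
series tindex = year*12 + month      # integer-valued time index
setobs unit_id tindex --panel-vars   # declare the panel structure by index variables
</hansl>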

 

 

 

Riccardo (Jack) Lucchetti | 25 Feb 17:54 2015

Re: Please Professor Lucchetti: question concerning ARMA model with Garch errors in Gretl

On Wed, 25 Feb 2015, kaffel bilel wrote:

> Dear Professor Lucchetti, hello, I hope that I am not disturbing you.
> I am Bilel Kaffel.

[...]

> My problem is: I want to know how I can estimate the parameters of an 
> ARMA model with GARCH errors in gretl. For example: an ARMA(1,2) model 
> with GARCH(1,1) errors. The parameters will need to be estimated 
> simultaneously. Please, Professor, can you help me?

Two things:

(1) I strongly advise you to subscribe to the gretl user list and ask 
these questions there: not only will you have a much better chance of 
receiving an answer in a short time, you will also benefit the gretl 
community because your question (and the subsequent answers) will be 
archived and made public (and, most importantly, googleable). For this 
reason, I'm replying to you with the user mailing list in cc. You can 
browse the archives here:

http://lists.wfu.edu/pipermail/gretl-users/

(2) Here is a sample script that estimates an arma(1,1) + garch(1,1). I'm 
sure that by studying this example together with the gretl manuals you'll 
find it quite easy to generalise it to the case of your interest:

<hansl>
set echo off
set messages off

function series arma_garch_flt(series y, matrix param, scalar p, scalar q)
     scalar m = param[1]
     matrix arpar = param[2:p+1]
     matrix mapar = param[p+2:p+q+1]
     scalar c = param[p+q+2]
     scalar a = param[p+q+3]
     scalar b = param[p+q+4]

     series ret = NA

     # --- checks
     # stationarity
     matrix roots = polroots(1 | -arpar)
     check = minc(abs(sumr(roots.^2))) > 1
     # invertibility
     matrix roots = polroots(1 | mapar)
     check = check && minc(abs(sumr(roots.^2))) > 1
     # garch param
     check = check && (c>0) && (a > 0) && (b>=0) && (a+b<1)

     if check
         scalar hlag = c/(1-a-b)
         series ret = -0.5*ln(2*$pi) # the loglikelihood
         series e = filter(y - m, 1 | -arpar, -mapar, m)
         series e2 = e^2
         scalar h_unc = c/(1-a-b)
         series e2lag = ok(e(-1)) ? e(-1)^2 : h_unc
         series h = filter(c + a*e2lag, 1, b)
         series ret = -0.5*ln(2*$pi) - 0.5*ln(h) - 0.5*e2/h
     endif

     return ret
end function

# ------------------------------------------------------------------------

open djclose.gdt --quiet

series r = 100 * ldiff(djclose)
arima 1 1 ; r
armapar = $coeff
garch 1 1 ; r
garchpar = $coeff[2:]

param = armapar | garchpar

mle ll = arma_garch_flt(r, param, 1, 1)
     params param
end mle -v
</hansl>
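
Purely as an untested sketch of that generalisation (mirroring the script above, with only the orders and starting values changed), the ARMA(1,2) + GARCH(1,1) case asked about would look like:

<hansl>
arima 1 2 ; r            # starting values for the ARMA(1,2) part
armapar = $coeff
garch 1 1 ; r
garchpar = $coeff[2:]
param = armapar | garchpar

mle ll = arma_garch_flt(r, param, 1, 2)
     params param
end mle -v
</hansl>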

-------------------------------------------------------
   Riccardo (Jack) Lucchetti
   Dipartimento di Scienze Economiche e Sociali (DiSES)

   Università Politecnica delle Marche
   (formerly known as Università di Ancona)

   r.lucchetti@...
   http://www2.econ.univpm.it/servizi/hpp/lucchetti
-------------------------------------------------------
Daniel Bencik | 24 Feb 17:46 2015

VECM/VAR estimation using GLS

Dear forum, 
 
I have a VECM; the residuals of the two equations show a correlation of approx. 0.55. Is there a way in
gretl to estimate a VECM/VAR using GLS, so that I can factor in the covariance structure of the residuals?
 
Many thanks, 
Daniel
 

Daniel Bencik | 24 Feb 13:53 2015

Regression with ex-ante boundaries imposed on forecasted values

Dear forum, 
 
this is more of a question related to econometrics. When, for example, your goal is to model/forecast
weekly highs/weekly lows and you run your regression on Tuesday, you already know that the model
should not predict a weekly high below Monday's high. Wednesday's prediction of the whole week's
high should not be below max(mondayHigh, tuesdayHigh). My question is whether there is an econometric
tool/approach that is capable of estimating a model bearing this in mind. That is, I want the estimated
coefficients to take into account that the forecasts should not be below/above some value which changes
over time (i.e. it is not a constant like, e.g., zero).
 
Many thanks, 
Daniel
 

Wingenroth, Thorsten | 23 Feb 15:33 2015

Re: graph is crazy - thank you!

Thanks to all, especially Allin. Works great now!

Kind regards,

Thorsten

-----Original Message-----
From: gretl-users-bounces@...
[mailto:gretl-users-bounces@...u.edu] On behalf of Allin Cottrell
Sent: Sunday, 22 February 2015 18:10
To: r.lucchetti@...; Gretl list
Subject: Re: [Gretl-users] graph is crazy

On Sun, 22 Feb 2015, Riccardo (Jack) Lucchetti wrote:

> On Sun, 22 Feb 2015, Allin Cottrell wrote:
>
>> A fix shouldn't be too far away. But this time I think I'll put in a
>> "backstop": if we find ourselves having to construct a "string table" 
>> for data that are composed of nothing but digits, dot and comma, we 
>> should flag an error and give up instead of accepting the data as string-valued.
>
> Excellent idea.

That's now in CVS and snapshots.

The problem with reading Thorsten's new data file was simpler than I guessed (and nothing to do with dates).
There was an oversight in the code that meant it was not general enough: we got lucky with the first file
Thorsten posted; the second one exposed the flaw. Now fixed.

Allin

