Topic Title: Martingale and Markov process
Created On Wed Nov 12, 03 12:46 PM


SPAAGG
Senior Member

Posts: 784
Joined: Mar 2003

Wed Nov 12, 03 12:46 PM

Hi!

Could someone help me to prove or disprove the following:

A is a martingale => A is a Markov process
A is a Markov process => A is a martingale

How should I go about solving this problem?

Thanks in advance!
 



mj
Senior Member

Posts: 3412
Joined: Dec 2001

Wed Nov 12, 03 03:00 PM

They are both false.



-------------------------
Course on Kooderive -- GPU programming with 100 times speed up in October 2014. www.markjoshi.com
 



asd
Senior Member

Posts: 662
Joined: Aug 2002

Fri Nov 14, 03 12:58 AM

"A is a martingale => A is markov process"

I know I am wrong, but here is my intuition which says that the above is true:

- A process is Martingale => Expected value for tommorrow is known today(which is today's observed value) => Expected value of tommorrow is same for all possible sigma algebras that will take place today => process is Markovian

Please help me to correct my mistake.

Thanks,
asd

-------------------------
Life is a journey and not a destination
quant code directory

Edited: Fri Nov 14, 03 at 12:59 AM by asd
 



quantie
Senior Member

Posts: 905
Joined: Oct 2001

Fri Nov 14, 03 04:18 AM

Quote

Originally posted by: SPAAGG
Hi!

Could someone help me to prove or disprove the following:

A is a martingale => A is a Markov process
A is a Markov process => A is a martingale

How should I go about solving this problem?

Thanks in advance!


These are two different notions.

Markov processes are random processes that have no memory, whereas martingales have their average value constant in a strong (conditional) sense.

There are some trivial cases where a Markov chain is itself a martingale; nevertheless, with every Markov chain there is a characteristic collection of martingales.

So for a simple symmetric random walk (X_n) on Z started from 0, taking f(i) = i and g(n, i) = i^2 - n,
both f(X_n) and g(n, X_n) are martingales.
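
As a quick illustration (a sketch of my own, not from the original post; the parameters are arbitrary), here is a Monte Carlo check of those two martingale properties for the simple symmetric random walk:

```python
# Sketch (assumption: simple symmetric random walk X_n on Z with X_0 = 0).
# Check empirically that E[X_(n+1) | X_n = x] ~ x and
# E[X_(n+1)^2 - (n+1) | X_n = x] ~ x^2 - n, i.e. both f(X_n) = X_n and
# g(n, X_n) = X_n^2 - n behave like martingales. (Conditioning on X_n alone
# is enough here because the walk itself is Markov.)
import numpy as np

rng = np.random.default_rng(0)
n_paths, n_steps = 200_000, 6

steps = rng.choice([-1, 1], size=(n_paths, n_steps))   # i.i.d. +/-1 increments
X = np.cumsum(steps, axis=1)                           # columns are X_1, ..., X_6

n = n_steps - 1                                        # condition at time n = 5
X_n, X_next = X[:, n - 1], X[:, n]                     # X_5 and X_6

for x in np.unique(X_n):
    mask = X_n == x
    print(f"X_n = {x:3d}:  E[X_(n+1)|X_n] ~ {X_next[mask].mean():7.3f} (target {x}),  "
          f"E[X_(n+1)^2-(n+1)|X_n] ~ {(X_next[mask]**2 - (n + 1)).mean():7.3f} "
          f"(target {x * x - n})")
```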


Edited: Fri Nov 14, 03 at 05:26 PM by quantie
 



sam
Senior Member

Posts: 703
Joined: Dec 2001

Fri Nov 14, 03 11:32 AM

This has been discussed loads of times in the forums. A search will take you to the pages you need.

Just think of some examples:

1. A biased coin (scoring +1 for H, -1 for T). Not a martingale, but still Markov... therefore we disprove the second implication.

2. An Ito integral:

dY = f dW

This is a martingale (for suitably integrable f); that is a basic property of Ito integrals. But if f depends on the path that Y has taken, then Y is not Markov. Going back to the coin example, now you have a fair coin but more complex rules:

If your previous throw was a head, then you get +1 for H and -1 for T on the next throw.
If your previous throw was a tail, then you get +2 for H and -2 for T on the next throw.

Your score is still a martingale, but it is no longer Markov, because the jumps that are possible on the next throw depend on what you have thrown already (a small simulation of this game is sketched below).

Regards,

Sam
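
A small simulation of that game (my own sketch, not part of sam's post; I assume the initial throw is also a fair coin): each bet has conditional mean zero, so the score is a martingale, but the set of possible next jumps depends on the previous throw, which the current score alone does not record, so the score is not Markov.

```python
# Sketch of the path-dependent coin game (assumed rules: fair coin; stake 1 if the
# previous throw was a head, stake 2 if it was a tail; win +stake on H, lose stake on T).
import numpy as np

rng = np.random.default_rng(1)
n_paths, n_steps = 100_000, 20

scores = np.zeros(n_paths, dtype=int)
prev_heads = rng.random(n_paths) < 0.5        # outcome of an initial fair throw

for _ in range(n_steps - 1):                  # play the first n_steps - 1 bets
    heads = rng.random(n_paths) < 0.5
    stake = np.where(prev_heads, 1, 2)
    scores += np.where(heads, stake, -stake)
    prev_heads = heads

# One more bet, keeping the conditioning information explicit.
heads = rng.random(n_paths) < 0.5
stake = np.where(prev_heads, 1, 2)
incr = np.where(heads, stake, -stake)

print(f"martingale:  E[next jump | last throw H] ~ {incr[prev_heads].mean():+.4f}")
print(f"             E[next jump | last throw T] ~ {incr[~prev_heads].mean():+.4f}")
print("non-Markov:  possible jumps after a head:", sorted(set(incr[prev_heads].tolist())))
print("             possible jumps after a tail:", sorted(set(incr[~prev_heads].tolist())))
```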

Edited: Fri Nov 14, 03 at 11:32 AM by sam
 



SPAAGG
Senior Member

Posts: 784
Joined: Mar 2003

Fri Nov 14, 03 01:07 PM

Thanks a lot for your replies.
 



elan
Senior Member

Posts: 350
Joined: Apr 2003

Fri Nov 14, 03 01:33 PM

Elementary explicit counterexamples:
1. The process dX_t = (\int_0^t X_s ds) dW_t is a martingale but is not Markov.
2. The process dX_t = a dt + \sigma dW_t is Markov but is not a martingale.
In the equations above, W_t denotes Brownian motion.
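
For intuition, here is a sketch of my own (assuming a plain Euler-Maruyama discretisation with arbitrary parameter values): the sample mean of the first process stays at its starting value, consistent with the martingale property, while the mean of the second drifts at rate a, so it cannot be a martingale.

```python
# Euler-Maruyama sketch (assumed step size and parameters) for elan's two examples:
#   (1) dX_t = (integral_0^t X_s ds) dW_t : martingale, so E[X_t] stays at X_0
#   (2) dX_t = a dt + sigma dW_t          : Markov, but E[X_t] = X_0 + a*t drifts
import numpy as np

rng = np.random.default_rng(2)
n_paths, n_steps, T = 50_000, 1_000, 1.0
dt = T / n_steps
a, sigma = 0.5, 0.3

x1 = np.ones(n_paths)          # process (1), started at X_0 = 1
x2 = np.ones(n_paths)          # process (2), started at X_0 = 1
integral = np.zeros(n_paths)   # running approximation of integral_0^t X_s ds

for _ in range(n_steps):
    dW = rng.normal(0.0, np.sqrt(dt), n_paths)
    x1_old = x1.copy()
    x1 = x1 + integral * dW    # diffusion coefficient is the running integral
    integral += x1_old * dt
    x2 = x2 + a * dt + sigma * dW

print(f"process (1): sample mean ~ {x1.mean():.4f}  (martingale: stays near 1.0)")
print(f"process (2): sample mean ~ {x2.mean():.4f}  (drifts towards {1 + a * T})")
```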

-------------------------
For whosoever hath, to him shall be given
 



amitkrkedia30
Junior Member

Posts: 1
Joined: Sep 2005

Sat Oct 08, 05 03:55 PM

Hi,
Can someone tell me if a roulette wheel can be considered a Markov process?
 



Stochastic44
Member

Posts: 58
Joined: Nov 2005

Fri Dec 09, 05 12:56 PM

I think you can consider it a Markov process, even if that has no practical interest.
I'm not sure, but it should be a Markov process with a [36x36] transition matrix with all elements equal to 1/36 (ignoring the zero pocket), since successive spins are independent.
(Homework) You can instead take the running sum of the individual spins as the time-varying random variable and try to find out whether it is Markov, just as in the heads/tails case (see the sketch below).
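
A tiny sketch along those lines (my own, keeping the post's simplification of 36 equally likely numbers and no zero pocket): the transition matrix between successive spins is uniform, and the running sum is Markov because the next sum depends only on the current sum and an independent spin.

```python
# Sketch (assumption: 36 equally likely outcomes, no zero pocket, fair wheel).
import numpy as np

n = 36
P = np.full((n, n), 1.0 / n)       # uniform transition matrix: spins are i.i.d.
print("rows sum to 1:", np.allclose(P.sum(axis=1), 1.0))

rng = np.random.default_rng(3)
spins = rng.integers(1, n + 1, size=10)   # outcomes 1..36
running_sum = np.cumsum(spins)            # S_k = spin_1 + ... + spin_k, a Markov chain
print("spins:       ", spins)
print("running sums:", running_sum)
```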



-------------------------
It is easier to rob by setting up a bank than by holding up a bank clerk.
BB
 
