

FORUMS > Student Forum
Topic Title: Martingale and Markov process (Created On Wed Nov 12, 03 12:46 PM)

SPAAGG
Senior Member

Posts: 784
Joined: Mar 2003

Wed Nov 12, 03 12:46 PM

Hi!

Could someone help me prove or disprove the following?

A is a martingale => A is a Markov process
A is a Markov process => A is a martingale

How can I go about solving this problem?

mj
Senior Member

Posts: 3445
Joined: Dec 2001

Wed Nov 12, 03 03:00 PM

They are both false.

-------------------------
Proof Patterns is now out.

asd
Senior Member

Posts: 662
Joined: Aug 2002

Fri Nov 14, 03 12:58 AM

"A is a martingale => A is a Markov process"

I know I am wrong, but here is my intuition, which says the above is true:

- A process is a martingale => the expected value for tomorrow is known today (it is today's observed value) => the conditional expectation of tomorrow's value is the same whichever sigma-algebra of past information we condition on => the process is Markovian.

Thanks,
asd

-------------------------
Life is a journey and not a destination
quant code directory

Edited: Fri Nov 14, 03 at 12:59 AM by asd

quantie
Senior Member

Posts: 905
Joined: Oct 2001

Fri Nov 14, 03 04:18 AM

Quote

Originally posted by: SPAAGG
Hi !

Could someone help me to prove or disprove:

A is a martingale => A is markov process
A is a markov process => A is a martingale

How can I do to solve this problem,

These are two different notions.

Markov processes are random processes that have no memory, whereas martingales have a constant conditional mean in a strong sense.

There are some trivial cases where a Markov chain is itself a martingale; more importantly, every Markov chain comes with a characteristic collection of martingales.

For example, take the simple symmetric random walk (Xn) on Z started at 0, and let f(i) = i and g(n, i) = i^2 - n;
then both f(Xn) and g(n, Xn) are martingales.

Edited: Fri Nov 14, 03 at 05:26 PM by quantie
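quantie's claim can be checked by a quick Monte Carlo: for the simple symmetric random walk started at 0, both f(X_n) = X_n and g(n, X_n) = X_n^2 - n should average to their time-zero value of 0. A minimal sketch (the function name and parameters are my own, not from the thread):

```python
import random

def walk_averages(n_steps, n_paths, seed=0):
    """Monte Carlo check that f(X_n) = X_n and g(n, X_n) = X_n^2 - n
    both have mean 0 for the simple symmetric random walk from 0."""
    rng = random.Random(seed)
    sum_f = sum_g = 0.0
    for _ in range(n_paths):
        x = 0
        for _ in range(n_steps):
            x += rng.choice((-1, 1))   # fair +/-1 step
        sum_f += x                     # f(X_n) = X_n
        sum_g += x * x - n_steps       # g(n, X_n) = X_n^2 - n
    return sum_f / n_paths, sum_g / n_paths
```

The exact argument behind the second martingale: each step has mean 0 and variance 1, so E[X_{n+1}^2 - (n+1) | X_n] = X_n^2 + 1 - (n+1) = X_n^2 - n.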

sam
Senior Member

Posts: 703
Joined: Dec 2001

Fri Nov 14, 03 11:32 AM

This has been discussed loads of times in the forums. A search will take you to the pages you need.

Just think of some examples:

1. A biased coin (scoring +1 for H, -1 for T). The running score is not a martingale, but it is still Markov, which disproves the second implication.

2. An Ito integral:

dY = f dz

This is a martingale; that is a basic property of Ito integrals. But if f depends on the whole path that Y has taken, then Y need not be Markov. Going back to the coin example, suppose you now have a fair coin but more complex rules:

If your previous throw was a head, you get +1 for H and -1 for T on the next throw.
If your previous throw was a tail, you get +2 for H and -2 for T on the next throw.

Your score is still a martingale, but it is no longer Markov, because the moves that are possible next depend on what you have thrown already, not just on the current score.

Regards,

Sam

Edited: Fri Nov 14, 03 at 11:32 AM by sam
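Both of sam's coin examples can be made concrete in a few lines. A sketch (function names and parameters are my own): the first function estimates the drift of the biased-coin score, and the second computes the exact conditional moments of the next increment under the path-dependent rules.

```python
import random

def biased_score_mean(p_head, n_steps, n_paths, seed=1):
    """Example 1: biased coin, +1 for H and -1 for T.  The score is
    Markov (i.i.d. increments) but drifts by (2*p_head - 1) per throw,
    so for p_head != 0.5 it is not a martingale."""
    rng = random.Random(seed)
    total = 0
    for _ in range(n_paths):
        s = 0
        for _ in range(n_steps):
            s += 1 if rng.random() < p_head else -1
        total += s
    return total / n_paths

def next_increment_moments(prev_throw):
    """Example 2 (path-dependent stakes): after a head the next increment
    is +/-1, after a tail it is +/-2, each with probability 1/2.  The
    conditional mean is 0 either way (so the score is a martingale), but
    the variance depends on the last throw, which the current score alone
    does not reveal, so the score is not Markov."""
    vals = (1, -1) if prev_throw == "H" else (2, -2)
    mean = sum(vals) / 2
    var = sum(v * v for v in vals) / 2 - mean ** 2
    return mean, var
```

With p_head = 0.6 the score climbs at roughly 0.2 per throw; meanwhile next_increment_moments returns mean 0 for both "H" and "T" but variances 1 and 4 respectively, which is exactly the martingale-but-not-Markov pattern sam describes.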


elan
Senior Member

Posts: 351
Joined: Apr 2003

Fri Nov 14, 03 01:33 PM

Elementary explicit counterexamples:
1. The process dX_t = (\int_0^t X_s ds) dW_t is a martingale but is not Markov.
2. The process dX_t = a dt + \sigma dW_t is Markov but is not a martingale.
In the equations above, W_t denotes Brownian motion.

-------------------------
For whosoever hath, to him shall be given
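elan's second counterexample is easy to verify numerically: an Euler scheme for dX_t = a dt + \sigma dW_t shows the mean drifting away from X_0 like a*t. A sketch under my own choice of names and parameters:

```python
import random

def drifted_mean(a, sigma, t, n_steps, n_paths, seed=2):
    """Euler scheme for dX_t = a dt + sigma dW_t with X_0 = 0.  The
    process is Markov (each increment depends only on the clock and the
    current state), but its mean grows like a*t, so for a != 0 it
    cannot be a martingale."""
    rng = random.Random(seed)
    dt = t / n_steps
    total = 0.0
    for _ in range(n_paths):
        x = 0.0
        for _ in range(n_steps):
            x += a * dt + sigma * (dt ** 0.5) * rng.gauss(0.0, 1.0)
        total += x
    return total / n_paths
```

With a = 1 and t = 1 the sample mean of X_1 lands near 1, not near the starting value 0, confirming the process is not a martingale.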

amitkrkedia30
Junior Member

Posts: 1
Joined: Sep 2005

Sat Oct 08, 05 03:55 PM

Hi,
can someone tell me whether a roulette wheel can be considered a Markov process?

Stochastic44
Member

Posts: 58
Joined: Nov 2005

Fri Dec 09, 05 12:56 PM

I think you can consider it a Markov process, even if that has no practical interest.
I'm not sure, but it should be a Markov process whose transition matrix is uniform: for a 36-number wheel, a [36x36] matrix with every entry equal to 1/36.
(Homework) You can also take the running sum of the outcomes as the time-varying random variable and try to work out whether it is Markov, just as in the heads/tails case.

-------------------------
It is easier to rob by setting up a bank than by holding up a bank clerk.
BB
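Stochastic44's point that independent spins are the degenerate "no memory" case can be sketched directly: every row of the transition matrix is the same uniform distribution, so the next pocket never depends on the current one. (Note a European wheel actually has 37 pockets, 0 through 36; the pocket count is left as a parameter here. The function name is my own.)

```python
def roulette_transition_matrix(n_pockets=37):
    """Independent, equally likely spins: the chain is trivially Markov
    because every row of its transition matrix is the same uniform
    distribution over the pockets."""
    p = 1.0 / n_pockets
    return [[p] * n_pockets for _ in range(n_pockets)]
```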