Welcome to the 1520 Perfect Dictionary


MARKOV CHAIN

Definition: (statistics) A stochastic process that moves through a set of discrete states, in which the probability of transitioning to the next state is fixed and depends only on the current state, not on the past history of the system; named for Andrei A. Markov, Russian mathematician.
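The memoryless property described above can be sketched in a short simulation. This is an illustrative example, not part of the dictionary entry: the two-state "weather" model and its transition probabilities are hypothetical, chosen only to show that each step depends solely on the current state.

```python
import random

# Hypothetical transition probabilities for a two-state Markov chain.
# Each row is the fixed distribution over next states given the current state.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng):
    """Pick the next state using only the current state's fixed probabilities."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, n, seed=0):
    """Generate a chain of n transitions; no step looks at earlier history."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(n):
        chain.append(step(chain[-1], rng))
    return chain

print(simulate("sunny", 5))
```

Note that `step` receives only the current state: that restriction is exactly the "not affected by the past history" clause in the definition.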
_____________________________________

See perfect aspect (1).
___

See perfect particularity (1).
___

See perfect analysis (1).
