3. Learning Objectives
▪ Analyze the relationship between a system's entry signals and changes in market volatility
▪ Distinguish whether a system's entry signal should be filtered based on liquidity
▪ Calculate the expected move of an index or security based on volatility measures
▪ Explain the basics of using fractal efficiency, chaos theory, or genetic algorithms in trading
▪ Explain the basics of using neural network (machine learning) programming to trade with market data
4. Measuring Volatility
▪ The volatility of most price series, whether stocks, financial markets, commodities, or the spread between two series, is directly proportional to the price level.
▪ Higher prices translate into higher volatility.
▪ This price-volatility relationship has been described as lognormal in the stock market, and is similar to a percentage-growth relationship.
▪ A log relationship is one way of explaining the relationship between price and volatility.
▪ When a linear price scale is plotted on the side of a chart, there is an equal distance between prices: each unit change is represented by the same vertical distance on the scale, regardless of the price level at which the change occurs.
▪ On a logarithmic price scale, prices are not positioned equidistantly; instead, the scale is plotted so that two equal percentage changes cover the same vertical distance.
5. The Price-Volatility Relationship
▪ Two stocks trading at the same price can have very different volatility. The difference may come not only from the nature of their businesses; stocks in the news also tend to be more volatile.
Measuring volatility correctly matters:
- When sizing positions in order to equalize risk.
- When balancing position sizes for pairs trading or market-neutral baskets.
- For assessing risk in portfolios.
▪ While each of these issues is covered elsewhere in this book, volatility is the key driver of risk, and measuring volatility correctly is necessary to control that risk.
6. Adjusting for a Base Price
▪ Somewhere below the cost of production, volatility goes to zero. Call that level the base price for the purpose of calculating volatility.
▪ Prices can then be adjusted by dividing the current price by the base price (p0), or by subtracting the base price from the current price and taking the natural log (ln) of the adjusted value.
▪ Charts of either of these adjusted prices will look similar to the unadjusted semi-log plot; however, when the base price is high, the adjusted price will be more useful.
7. Adjusting for a Base Price
Exceptions:
▪ 1. Interest rates trade as prices on the futures markets, which move inversely to yield. When evaluating long-term volatility for rates, and in most cases when using percentages, yield should be used.
▪ 2. Foreign exchange has no base price, only equilibrium: the price that all traders, and the governments, accept as fair value for the moment. This situation is always short-lived. Prices get more volatile as they move away from equilibrium in either direction.
▪ 3. Energy is controlled by a cartel, which attempts to set supply and target a price range.
8. Determining the Base Price
Two straightforward ways:
▪ 1. Use a linear regression of prices. Apply a standard least-squares regression, available in Excel under Data/Data Analysis, with sequential integers as x and the closing prices as y.
▪ The regression line is then p̂t = a + bt, where a is the y-intercept and b the slope. To find the base price, calculate the residuals, rest = pt − p̂t, and find the minimum residual value. Subtract that minimum value from the regression line to get the base price line. The base price continues to increase with time.
▪ 2. Use a linear regression of volatility. Instead of price, using a measure of volatility related to the price level will give a more direct view of the price-volatility relationship.
9. Determining the Base Price
Scatter diagrams of volatility against price for monthly copper, 1974 to July 2011: (a) price and monthly returns; (b) price and monthly price range.
10. The Time Interval
▪ The time period over which volatility is measured is also a significant factor in the price-volatility relationship. A longer period means that the net changes over n days may stretch into months or years.
▪ Longer measurement periods give higher volatility values; however, the rate at which volatility increases declines over time.
Figure: Change in volatility relative to the interval over which it is measured.
11. An Example of a Lognormal Calculation
▪ Over the long term and under average market conditions, the relationship between actual price changes and volatility is expected to be v(ps) / v(pt) = ln(ps) / ln(pt).
▪ For example, if the price on day t is 20 and the price on day s is 40, then the natural logs of the prices on those days are ln(20) = 2.996 and ln(40) = 3.689. If the volatility is $1.00 when pt = 20, then the volatility is expected to be $1.23 when ps = 40, because 3.689 / 2.996 ≈ 1.23.
- When using a spreadsheet for your calculations, note that the function ln is not the same as the function log.
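The scaling in the example above can be sketched in a few lines. This is a minimal sketch, not the book's code; the function name `scaled_volatility` is an assumption made here for illustration.

```python
import math

def scaled_volatility(v_t, p_t, p_s):
    """Scale known volatility v_t at price p_t to the expected volatility at
    price p_s, assuming the lognormal relationship v_s / v_t = ln(p_s) / ln(p_t).
    Note: math.log is the natural log (ln), not base-10 log."""
    return v_t * math.log(p_s) / math.log(p_t)

# Volatility of $1.00 at a price of 20 implies roughly $1.23 at a price of 40.
v_40 = scaled_volatility(1.00, 20, 40)
```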
12. Volatility Measures
▪ The change in price over n days
▪ The maximum price fluctuation during the n days
▪ The average true range
▪ The sum of the absolute price changes over n days
▪ Classic annualized volatility for daily data
14. Volatility Measures
1. The change in price over n days (Figure 27.5a): Vt = pt − pt−n
2. The maximum price fluctuation during the n days (Figure 27.3b): Vt = Max(high, n) − Min(low, n), where Max and Min are the same as the TradeStation functions Highest and Lowest.
3. The average true range over the past n days, where True range is a function that returns the maximum range from the combination of today's high, low, and previous close.
4. The sum of the absolute price changes over n days: Vt = Σ |pi − pi−1|. For stocks, the sum of the returns should be used.
5. Classic annualized volatility for daily data: the standard deviation of daily returns over n days, scaled by the square root of 252.
15. Comparing Annualized Volatility and Average True Range
▪ Results from annualized volatility and average true range can be very different and can significantly affect trading decisions, position size, and risk assessment.
▪ The ATR is considerably smoother on a daily basis and shows smaller jumps when prices gap. In some cases, such as at the far right of the chart, the true range is increasing while the annualized volatility is flat.
▪ Both can be converted to dollar values by multiplying by the current price.
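To make the two measures concrete, here is a minimal sketch of both, assuming 252 trading days per year; the function names are chosen here for illustration and are not from the source.

```python
import statistics

def true_range(high, low, prev_close):
    # Largest of: today's range, gap up from prior close, gap down from prior close.
    return max(high - low, abs(high - prev_close), abs(low - prev_close))

def average_true_range(highs, lows, closes, n):
    """Simple average of the last n true-range values."""
    trs = [true_range(highs[i], lows[i], closes[i - 1])
           for i in range(1, len(closes))]
    return sum(trs[-n:]) / n

def annualized_volatility(closes, n):
    """Standard deviation of daily returns over the last n days, times sqrt(252)."""
    returns = [closes[i] / closes[i - 1] - 1 for i in range(1, len(closes))]
    return statistics.stdev(returns[-n:]) * 252 ** 0.5
```

Either result can be converted to a dollar value by multiplying by the current price, as the slide notes.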
16. Relative Volatility
▪ Relative volatility (RV) can be defined as the volatility over a short period divided by the volatility over a longer period, where the longer period is typical of normal volatility: RVt = Vt(n) / Vt(m),
▪ where Vt is any of the volatility measures, and n and m are calculation periods with f × n = m, where f >= 5 (f >= 10 would be better).
▪ Lagging the Longer Period
▪ A better measure lags the longer calculation so that it ends before the shorter one starts.
▪ The shorter calculation, n, then goes from t − n + 1 to t, and the longer one, m, goes from t − m − n + 1 to t − n: non-overlapping periods.
▪ This method also helps on the back side of a volatile period, when the typical calculation includes the recent volatility, making the declining volatility seem normal rather than still volatile.
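A sketch of relative volatility with the lagged longer window, using the standard deviation of returns as the volatility measure (any of the measures above would do); the function name and signature are assumptions made here.

```python
import statistics

def relative_volatility(closes, n, m, lagged=True):
    """RV = short-period volatility / longer-period volatility.
    With lagged=True, the longer window of m returns ends where the shorter
    window of n returns starts, so the two periods do not overlap."""
    returns = [closes[i] / closes[i - 1] - 1 for i in range(1, len(closes))]
    short = statistics.stdev(returns[-n:])
    long_slice = returns[-(m + n):-n] if lagged else returns[-m:]
    return short / statistics.stdev(long_slice)
```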
17. Implied Volatility, VIX
▪ The CBOE's volatility index, VIX, reflecting the implied volatility of S&P options, is available on a real-time basis.
▪ VIX was originally the volatility of an options index, OEX, a weighted value of the implied volatilities of 8 puts and calls on the S&P 100, expressed as a percentage of the index price.
▪ If the VIX is 25% and the SPX is 1600, then VIX is forecasting 25% annualized volatility for at-the-money options, relative to the price of the SPX, over a rolling 30-day expiration period.
▪ The 30-day calendar period is equivalent to about 21 trading days, and there are 252 trading days in the year, so 1 standard deviation of the 30-day move becomes 1600 × 0.25 × √(21/252) = 115.47.
▪ An implied volatility of 25% when the SPX is at 1600 therefore implies a 68% (1 standard deviation) chance of a price change within ±115.47 over the next 30 calendar days (21 trading days).
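The expected-move arithmetic above can be checked directly; this is a small sketch, with the function name chosen here for illustration.

```python
def expected_move(index_price, implied_vol, trading_days=21):
    """1-standard-deviation expected move over the option horizon,
    scaling annualized implied volatility by sqrt(trading_days / 252)."""
    return index_price * implied_vol * (trading_days / 252) ** 0.5

# VIX at 25% with the SPX at 1600: about +/- 115.47 over 30 calendar days.
move = expected_move(1600, 0.25)
```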
18. Intraday Volatility and Volume
▪ Intraday volatility has a pattern nearly identical to volume: highest at the open, declining to its lowest point at mid-session, then rising again as the trading day ends.
▪ To find the correlation between the intraday pattern of volatility and that of volume, a simple linear regression of volatility against volume can be solved,
▪ where all values are calculated at time t. The resulting correlation, R = 0.595, is statistically significant for NASDAQ 100 volatility and volume. The correlation between volatility and volume is highest at the beginning and end of the day.
19. Intraday Volatility and Volume
▪ Meissner and Cercioglu suggest that this volatility pattern, with the corresponding volume that provides liquidity, can be traded by being long options at the beginning and end of the day, profiting from gamma (the rate of change of delta, where delta is the rate of change of the option price with respect to the price of the underlying asset).
▪ During the quiet mid-session period, a short options position may be used to profit from theta, the time decay.
20. Predicting Volatility with Trading Ranges
▪ Thomas Bierovic: On-Balance True Range
▪ VIX Trading Systems: Connors
▪ MarketSci Blog
▪ Gerald Appel on VIX
▪ Fractals, Chaos, and Entropy
▪ Trends and Price Noise
▪ Trends and Interest Rate Carry
▪ Neural Networks
▪ Modeling Human Behavior
▪ Genetic Algorithms
▪ Liquidity
▪ Trade Selection Using Volatility
21. Thomas Bierovic: On-Balance True Range
▪ To visualize the change in volatility, Thomas Bierovic has created an On-Balance True Range by following the same rules as On-Balance Volume (OBV), but substituting the true range calculation for volume.
▪ He then calculates a 9-day exponential smoothing of the On-Balance True Range and uses the crossovers of the oscillator and the smoothed oscillator to confirm signals.
▪ Although the highs and lows may come at nearly the same time as those of other oscillators, the relative peaks and valleys may offer the trader new insights. For many traders, this simple interpretation can help separate high- and low-volatility conditions.
22. VIX Trading Systems: Connors
▪ VIX is considered a mean-reverting indicator.
▪ Larry Connors has based a number of trading systems on the VIX.
▪ Connors treats volatility as mean-reverting. Entries are based on a minor reversal in the VIX. The rules for buying (selling is the reverse) are:
▪ 1. Today's VIX high must be higher than the VIX high of the past 10 days.
▪ 2. Today's VIX must close below its open.
▪ 3. Yesterday's VIX must have closed above its open.
▪ 4. Today's VIX range must be greater than the ranges of the past 3 days.
▪ 5. If conditions 1-4 are met, then buy S&P futures on the close and exit in 3 days.
▪ Connors is actually looking for turning points in the VIX.
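The four buy conditions above translate almost directly into code. A minimal sketch, assuming daily VIX open/high/low/close lists where index -1 is today; the function name is chosen here for illustration.

```python
def connors_vix_buy(highs, lows, opens, closes):
    """Connors' four buy conditions on the VIX (needs at least 11 days of
    data; the selling rules are the mirror image)."""
    ranges = [h - l for h, l in zip(highs, lows)]
    return (
        highs[-1] > max(highs[-11:-1])       # 1. higher than the past 10 days' highs
        and closes[-1] < opens[-1]           # 2. today closed below its open
        and closes[-2] > opens[-2]           # 3. yesterday closed above its open
        and ranges[-1] > max(ranges[-4:-1])  # 4. range expansion vs. past 3 days
    )
```

If the function returns True, the rule buys S&P futures on the close and exits in 3 days.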
23. VIX Trading Systems: Connors
▪ Connors' mean-reverting VIX strategy uses the RSI for timing. If
▪ 1. The S&P > its 200-day moving average,
▪ 2. The 2-day RSI of the VIX > 90, and
▪ 3. Today's VIX open > yesterday's VIX close,
▪ then buy on the close and exit when the 2-day RSI closes > 65.
▪ The specific pattern that precedes a buy signal in the S&P ends with a range expansion.
▪ This expansion is likely to mark the end of a short-term upwards move in the VIX.
▪ The decline in the VIX that follows eases the way for a short-term rally in the S&P.
24. MarketSci Blog
▪ An interesting website is the MarketSci Blog, which offers numerous creative strategies for equities trading.
▪ One strategy applies a 10-day exponential smoothing (EMA) and a 10-day simple moving average (SMA), both to the VIX index, and buys the VIX when the EMA falls below the SMA, then sells short when the EMA moves above the SMA.
▪ It does this based on the concurrent closing prices.
▪ Fast execution is essential for mean-reverting trades.
▪ Unlike most other strategies, this one is remarkably symmetric, with longs and shorts performing equally.
25. Gerald Appel on VIX
▪ Gerald Appel gives his own thoughts on trading the VIX:
▪ Buy when there are high levels of VIX, implying broad pessimism.
▪ There are no reliable sell signals using VIX.
▪ Volatility tends to increase during weaker market climates.
▪ The stock market is likely to advance for as long as volatility remains stable or decreasing.
▪ Volatility System
▪ Bookstaber uses the average true range (ATR) over the past n days as the basis for a simple volatility strategy:
▪ Buy if the next close, Ct+1, rises by more than k × ATRt(n) from the current close Ct.
▪ Sell if the next close, Ct+1, falls by more than k × ATRt(n) from the current close Ct.
▪ The volatility factor k is given as approximately 3, but can be varied higher or lower to make the trading signals less or more frequent, respectively. This method falls into the category of volatility breakout.
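The buy/sell rules above reduce to a single comparison against k × ATR. A minimal sketch of the signal logic (the ATR itself would come from one of the measures earlier in the chapter); the function name is an assumption made here.

```python
def volatility_breakout_signal(current_close, next_close, atr, k=3.0):
    """Bookstaber-style volatility breakout: +1 buy, -1 sell, 0 no signal,
    when the next close moves more than k * ATR from the current close."""
    if next_close - current_close > k * atr:
        return 1
    if current_close - next_close > k * atr:
        return -1
    return 0
```

Lowering k makes signals more frequent; raising it makes them rarer, matching the slide's note.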
26. Trends and Price Noise
▪ The trader who enters a new trend sooner will be more profitable, but the erratic behavior of prices makes a faster response less reliable. What makes a trend so difficult to identify is noise.
▪ Noise is the erratic movement of price and, by definition, it is unpredictable. In engineering, this type of behavior that shows no patterns is called white noise.
▪ Measuring noise with the efficiency ratio, or fractal efficiency, shows that different markets have different levels of noise: the equity index markets have the most noise and short-term rates the least.
▪ Noise is the product of a large number of market participants buying and selling at different times for different purposes. Each has its own objectives and time frame.
▪ Noise can also be the result of price shocks: unexpected events, in particular news or economic reports, that cause changes persisting for varying periods into the future. Noise has most of the qualities of a sequence of random numbers.
27. Trade Selection Summary
▪ Every system has risk, even so-called riskless trades. Arbitrage, when done properly, has virtually no risk; however, it may be so competitive that the opportunities are rare and the margin of profit small.
▪ In the final analysis, you can't remove risk, only delay it or move it around. If your system shows that it has essentially no risk, it's important to rethink your development process to find the flaw.
▪ When selecting trades to eliminate, the easiest place to begin is by associating performance with volatility or price level.
▪ While some systems perform better in an environment of higher volatility, your strategy may show the best return relative to risk when there is less volatility. When you filter trades you will always get rid of some good ones while, hopefully, removing more of the bad ones.
28. Trade Selection Using Volatility
▪ High Volatility: Eliminate or Delay?
▪ Constructing a Volatility Filter
▪ Standard Deviation Measurement
▪ Entry Filter Results
▪ High Volatility Exits: Reducing the Risk
▪ Ranking Based on Volatility
▪ Trade Selection Summary
29. Trade Selection Using Volatility
▪ High volatility is clearly related to greater risk, but low volatility may also mean that there is a smaller chance for profits.
▪ Reasonable expectations for selecting trades based on volatility:
▪ Entering on very high volatility is exposure to very high risk. Returns from high-volatility trades may range from large profits to large losses.
▪ Entering on extremely low volatility seems safe, but prices often have no direction and produce small, frequent losses. Waiting for an increase in activity before entering might improve returns.
▪ Exiting a position when prices become very volatile should reduce both profits and risk, but may come too late.
30. Eliminate or Delay?
▪ At the time of an entry signal, there are two choices.
▪ The trade can be completely eliminated by filtering,
▪ or it can be delayed until the high volatility drops, or the low volatility increases, to an acceptable level.
▪ Before starting, we can theorize that short-term trading would most likely eliminate, not delay, trades that fall outside the acceptable volatility range, because there are many trades and each is held for a short time.
▪ At the other end of the spectrum are long-term trend trades, held for weeks, which would suffer if an exceptionally large profit were missed.
31. Constructing a Volatility Filter
▪ Calculating the volatility is simple to program using any spreadsheet or strategy-testing software.
The following steps were used here:
▪ 1. Calculate a moving average trend. Use one fast trend and one slow trend.
▪ 2. Calculate the volatility, using any one of the methods described earlier in this chapter, but not including the volatility of the current day.
▪ 3. Enter a new trade (on the close) if today's volatility is (a) above the low filter threshold or (b) below the high filter threshold.
▪ 4. Exit a current position if the volatility is above the high filter threshold and, based on testing,
▪ (a) the current price change has moved in a profitable direction, or
▪ (b) the current price change has moved in a losing direction.
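The entry-filter test in step 3 can be sketched as a single band check: allow the entry only when the latest volatility reading lies between a low and a high threshold set in standard deviations around its recent average. This is a sketch under those assumptions, not the book's TSM code.

```python
import statistics

def volatility_filter(vols, sd_low=-1.0, sd_high=1.0):
    """Return True if the newest volatility reading, vols[-1], falls inside
    [mean + sd_low * stdev, mean + sd_high * stdev] of the earlier readings,
    which serve as the measure of normal volatility."""
    mean = statistics.mean(vols[:-1])
    sd = statistics.stdev(vols[:-1])
    return mean + sd_low * sd <= vols[-1] <= mean + sd_high * sd
```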
32. Standard Deviation Measurement
▪ A standard deviation was used to determine the volatility threshold level because these levels are associated with probabilities.
▪ A high-volatility filter with a 1-standard-deviation threshold means that no trades are taken if the volatility is above the average volatility plus 1 standard deviation: the top 16% of occurrences.
▪ A 2-standard-deviation threshold filters out volatility in the top 2.5%, and 3 standard deviations restricts only the top 0.13%.
▪ In all cases, a 20-day standard deviation will be used, comparable to VIX.
▪ The program TSM Moving Average will be used as the underlying strategy.
▪ It enters and exits trades based entirely on the direction of the moving average trendline, not on price penetration of the trendline.
33. Entry Filter Results
34. Ranking Based on Volatility
▪ Gerald Appel offers an additional approach to trade selection by creating a ranking method for mutual funds:
▪ Select only funds with average or below-average volatility.
▪ Add the 3-month and 12-month performance together to get a single value.
▪ Rank the funds.
▪ Only invest in the top 10%.
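The four steps above can be sketched directly. The field names (`vol`, `perf3m`, `perf12m`) and the function name are hypothetical, chosen here for illustration.

```python
import statistics

def appel_rank(funds):
    """Appel's fund-selection sketch: drop funds with above-average volatility,
    score the rest by 3-month + 12-month performance, keep the top 10%
    (at least one fund). `funds` is a list of dicts with hypothetical keys
    'name', 'vol', 'perf3m', 'perf12m'."""
    avg_vol = statistics.mean(f["vol"] for f in funds)
    eligible = [f for f in funds if f["vol"] <= avg_vol]
    ranked = sorted(eligible, key=lambda f: f["perf3m"] + f["perf12m"], reverse=True)
    keep = max(1, len(ranked) // 10)
    return [f["name"] for f in ranked[:keep]]
```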
35. Liquidity
▪ One reason for disappointing performance is a lack of understanding of market liquidity at the time of execution.
▪ Consider two systems:
▪ 1. A trend-following method, which will trigger buy or sell orders as prices rise or fall during the day.
▪ 2. A countertrend system, which sells and buys at relative intraday highs or lows.
36. Liquidity
▪ The endpoints are shown to contribute the largest part of the profits when, in reality, no executions may have been possible near those levels.
▪ Assuming the ability to execute at all points on the actual distribution bb′, the approximate profit contribution is shown as dd′.
▪ For trend-following systems, no profits should be expected when buy or sell orders are placed at the extremes of the day.
▪ The actual price distribution bb′ is the maximum that could be expected from such a system; in reality, the first day is usually a loss.
37. Liquidity
▪ The dotted line cc′ represents the apparent profit for a countertrend system that assumes a straight-line volume distribution.
39. Neural Networks
▪ Neural networks are recognized as a powerful tool for uncovering market relationships.
▪ The technique offers an exceptional ability to discover nonlinear relationships between any combination of fundamental information, technical indicators, and price data.
▪ The operation of an artificial neural network can be thought of as a feedback process, similar to the Pavlovian approach to training a dog:
▪ 1. A bell rings.
▪ 2. The dog runs to 1 of 3 bowls.
▪ 3. If right, the dog gets a treat; if wrong, the dog gets a shock.
▪ 4. If trained, stop; if not trained, go back to Step 1.
40. Neural Networks
▪ Terminology of Neural Networks
▪ Neurons are the cells that compose the brain; they process and store information.
▪ Networks are groups of neurons.
▪ Dendrites are receivers of information, passing it directly to the neurons.
▪ Axons are pathways that come out of the neuron and allow information to pass from one neuron to another.
▪ Synapses exist on the path between neurons and may inhibit or enhance the flow of information between neurons. They can be considered selectors.
42. Artificial Neural Networks
▪ The human brain works in a way very similar to an artificial neural network: it groups and weighs the data, combines them into subgroups, and finally produces a decision.
▪ The human process of weighing the data is complex and not necessarily transparent; that is, we may never know the precise flow of data.
▪ The weighting factors are found to show that unemployment has a strong negative effect on prices, GDP a strong positive effect, and inventories a weak positive effect.
▪ The other items had no consistent predictive ability and received a weight of zero. This feedback process is called training.
44. Selecting and Preprocessing the Inputs
▪ We must decide which factors are most likely to affect the direction of stocks and our ability to anticipate that direction, then prepare data that contains information with those qualities.
▪ There are countless factors that might influence the direction of stocks; the more you choose, the slower the solution and the greater the chance of a less robust model.
▪ If you choose too few, they may not contain enough information; therefore, the preprocessing problem requires practice.
▪ You may also construct a number of simple trading systems that show profits and include their basic components as inputs to the neural network.
▪ You might create a performance series for a specific system that has only the values −1, 0, and +1, representing short, neutral, and long market positions.
45. The Training Process
▪ At the heart of the neural network approach is the feedback process used for training.
▪ This is the part of neural networks that many people refer to as the learning process.
▪ Weighting factors are found using a method called a genetic algorithm.
▪ As training proceeds, these weighting factors are randomly mutated, or changed, until the best combination is found.
▪ The genetic algorithm changes and combines weighting factors in a manner referred to as survival of the fittest, giving preference to the best and discarding the worst.
46. A Training Example
▪ The five most relevant fundamental factors: GNP, unemployment, inventories, the U.S. dollar index, and short-term interest rates.
▪ This test does not use any preprocessed data, such as trends or indicators. To simplify the process, the following approach is taken:
▪ 1. Each input is normalized so that it has values between +100 and −100, indicating strength to weakness, with 0 as neutral.
▪ 2. When the combined value of the five indicators exceeds +125, we enter a long position; when the combined value is below −125, we enter a short.
▪ 3. Values between +125 and −125 are considered neutral to the trading strategy.
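The decision rule in steps 1-3 is a simple threshold on the sum of the normalized inputs; a minimal sketch (the function name is chosen here for illustration):

```python
def combined_signal(factors):
    """Each factor is normalized to [-100, +100]. Go long when the combined
    value exceeds +125, short below -125, otherwise stay neutral."""
    total = sum(factors)
    if total > 125:
        return "long"
    if total < -125:
        return "short"
    return "neutral"
```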
47. Success Criteria
▪ Determining success during the learning process is a matter of measuring the ANN output against the training data, looking for convergence.
▪ The common measurements are the average correlation (adjusted for the number of parameters), the t-statistic, the t2-statistic, and the F-statistic.
▪ The t2-statistic is unique to neural networks and measures the nonlinear relationships between two variables.
48. Reducing the Number of Decision Levels and Neurons
▪ When there are many decision layers and many neurons, the inputs can be combined and recombined in many different ways, allowing very specific patterns to be found.
▪ The more specific, the greater the chance that the final solution will be overfitted; that is, it will be fine-tuned to such specific patterns in the past that those patterns will not occur in the future.
▪ Neural networks can be highly complex and require experience before they can be used efficiently.
▪ Too many inputs and combinations increase the time of testing and increase the chance of a solution that is overfit.
▪ Too few values can produce a result that is too general, has large risk, and is not practical. It is best to begin with the most general and proceed in clear steps toward a more specific solution.
49. Modeling Human Behavior
▪ Neural networks are considered a learning process similar to the parallel architecture of the human brain.
▪ Early in 2003, as well as after the subprime crisis of 2008, corporate earnings and income growth were most important. Other reports were, for the most part, ignored.
▪ After earnings improved and the stock market had rallied, traders looked for employment statistics to improve as a means of sustaining economic growth and the stock market rally.
▪ At that point, earnings were no longer as important as more jobs.
▪ A neural network can be constructed to reflect this selection process, which makes one or two economic reports more important than others given the state of the economy.
50. Genetic Algorithms
▪ Representation of a Genetic Algorithm
▪ Initial Chromosome Pool
▪ Fitness
▪ Mutation
▪ Mating
▪ Propagation
▪ Converging on a Solution
▪ Putting It into Practice: Simulated Performance
▪ Multiple Seeding
▪ Replication of Hedge Funds
51. Genetic Algorithms
▪ The concept of a genetic algorithm is based on Darwin's theory of survival of the fittest.
▪ In the real world, a mutation with traits that improve any creature's ability to survive will continue to procreate.
▪ A genetic algorithm is actually a sophisticated search method that replaces standard optimization; it uses a technique that parallels the survival of the fittest.
▪ Standard statistical criteria are used in the selection process to qualify the results.
▪ Searching for a large, optimal set of parameters or finding the best portfolio allocation takes minutes using a genetic algorithm; a standard sequential search may take weeks at the same computing speed.
52. Representation of a Genetic Algorithm
▪ The most basic component of a genetic algorithm is a gene; a number of genes comprise an individual, and a combination of individuals (and therefore genes) is a chromosome.
▪ A chromosome represents a potential solution, a set of trading rules or parameters, where the genes are the specific values and calculations. These in turn form individuals that represent rules that ultimately form a trading strategy.
53. Representation of a Genetic Algorithm
▪ Chromosome 1 might be a rule to buy on strength:
▪ 1. If a 10-day moving average is less than yesterday's close and a 5-day stochastic is greater than 50, then buy.
▪ Chromosome 2 could be a rule that buys on weakness:
▪ 1. If a 20-day exponential is less than yesterday's low and a 10-day RSI is less than 50, then buy.
▪ If we rewrite these two chromosomes in notational form, the genes and individuals in their structure become more apparent:
▪ 1. Chromosome 1: MA, 10, <, C, [0], &, Stoch, 5, >, 50, 1
▪ 2. Chromosome 2: Exp, 20, <, L, [1], &, RSI, 10, <, 50, 1
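The notational form above maps naturally onto tuples. A minimal sketch of encoding and evaluating the two example chromosomes; the indicator and trend calculations themselves are omitted and their precomputed values passed in, and the function names are assumptions made here.

```python
# Gene order follows the slide: trend type, period, operator, price reference,
# lag, connector, indicator type, period, operator, comparison level, action.
CHROMOSOME_1 = ("MA", 10, "<", "C", 0, "&", "Stoch", 5, ">", 50, "buy")
CHROMOSOME_2 = ("Exp", 20, "<", "L", 1, "&", "RSI", 10, "<", 50, "buy")

def compare(left, op, right):
    return left < right if op == "<" else left > right

def evaluate(chromosome, trend_value, price_value, indicator_value):
    """Return the chromosome's action if both of its conditions hold,
    given precomputed trend, reference-price, and indicator values."""
    _, _, trend_op, _, _, _, _, _, ind_op, ind_level, action = chromosome
    if compare(trend_value, trend_op, price_value) and compare(indicator_value, ind_op, ind_level):
        return action
    return None
```

A mutation in this representation is just a change to one gene, e.g. replacing the period 10 with 5, as the next slide describes.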
54. Representation of a Genetic Algorithm
▪ For example, the 10- and 20-day averages in gene 2 of chromosomes 1 and 2 could be changed to 5 and 15 days, or the indicators Stoch and RSI could be changed to MACD and Momentum.
▪ A combination of trading rules, or chromosomes, will create a trading strategy. Before continuing, the following steps will be needed to use the genetic algorithm to find the best results:
▪ 1. A clear way of representing the chromosomes and their component individuals and genes.
▪ 2. A fitness criterion to decide that one chromosome is better than another.
▪ 3. A propagation procedure that determines which chromosomes will survive and in what manner.
▪ 4. A process for mutation (introducing new characteristics) and mating (combining genes) to give chromosomes with greater potential a better chance for survival.
55. Representation of a Genetic Algorithm
Initial Chromosome Pool - Eleven lists are needed, one for each unique gene:
1. Trend type, 1 of 5 choices: a moving average, exponential smoothing, linear regression, breakout, or step-weighted average.
2. Trend calculation period, a number between 1 and 200.
3. Trend relational operator, 1 of 3 choices: <, <=, >.
4. Price used in the trend calculation, 1 of 4 choices: C, (H + L + C)/3, (H + L)/2, indexed value.
5. Reference data or lag: a number between 1 and 10.
6. Method of combining individuals, 1 of 2 choices: and or or.
7. Indicator type, 1 of 5 choices: RSI, stochastic, MACD, momentum, Fisher transform (all indicators must be transformed to return values between −100 and +100).
8. Indicator calculation period, a number between 1 and 50.
9. Relational operator, 1 of 2 choices: > or <.
10. Comparison value for the indicator and relational operator, a number between −100 and +100.
11. Market action, 1 of 2 choices: buy or sell.
56. Representation of a Genetic Algorithm
▪ Fitness - defining a fitness criterion, or objective function, which can be used to rank the chromosomes.
▪ A fitness criterion must combine the most important features associated with a successful trading strategy:
▪ Net profits or profits per trade.
▪ The number of trades or a sample-error criterion.
▪ The smoothness of the results or a reward-to-risk ratio.
▪ To measure that result, a function of the following might be used:
▪ PPT = the profits per trade
▪ NT = the number of trades
▪ GP = the gross profits
▪ GL = the gross losses
57. Representation of a Genetic Algorithm
▪ Propagation - the process of natural selection allows only the best individuals to survive.
▪ A strong propagation criterion is used to encourage the survival of chromosomes with the highest ranking, as determined by the fitness test.
▪ When an individual has a high fitness score, it is allowed to create more offspring; therefore, it becomes a larger part of the population.
▪ When it has a low score, it creates fewer offspring, or no offspring, and eventually disappears from the population.
58. Fractals, Chaos, and Entropy
- Chaos theory
- Fractal Dimension
- Entropy
- Chaotic Patterns and Market Behavior
59. Fractals, Chaos, and Entropy
▪ Chaos theory is a way to describe the complex behavior of nonlinear systems, those that cannot be described by a straight line. It is also called nonlinear dynamics.
▪ One method of measuring chaotic systems is with various geometric shapes. This effort has resulted in an area of mathematics now called fractal geometry.
▪ In the real world, however, there are no straight lines; if you look closely enough, using a microscope if necessary, all "straight lines" have ragged edges and all may be described as chaotic.
60. Fractal Dimension
▪ Fractal dimension is the degree of roughness or irregularity of a structure or system.
▪ Using Fractal Efficiency
▪ Kaufman's Efficiency Ratio is formed by dividing the absolute value of the net change in price movement over n periods by the sum of all component moves, taken as positive numbers, over the same n periods.
▪ If the ratio approaches the value 1.0, the movement is smooth (not chaotic); if the ratio approaches 0, there is great inefficiency, chaos, or noise.
▪ This same measurement has been renamed fractal efficiency. Kaufman related this to trending and nontrending patterns, when the ratio approaches 1.0 and 0, respectively.
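The Efficiency Ratio described above can be computed directly from a price series; this is a minimal sketch (the function name is illustrative):

```python
def efficiency_ratio(prices, n):
    """Kaufman's Efficiency Ratio over the last n periods:
    |net change| divided by the sum of absolute period-to-period changes.
    Near 1.0 = smooth (trending); near 0 = noisy (chaotic) movement."""
    window = prices[-(n + 1):]          # n changes require n + 1 prices
    net_move = abs(window[-1] - window[0])
    total_move = sum(abs(b - a) for a, b in zip(window, window[1:]))
    return net_move / total_move if total_move else 0.0

smooth = efficiency_ratio([1, 2, 3, 4, 5, 6], 5)   # 1.0: every move in one direction
noisy = efficiency_ratio([1, 2, 1, 2, 1, 2], 5)    # 0.2: mostly back-and-forth noise
```

The steadily rising series scores 1.0 while the oscillating series scores 0.2, illustrating the trending versus nontrending interpretation.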
61. Fractal Dimension
▪ Although each market has its own underlying level of noise, the measurement of fractal efficiency should be consistent over all markets. Markets may vary in volatility although their chaotic behavior is technically the same.
▪ The interpretation of fractal efficiency as noise allows trading rules to be developed.
▪ For example, a market with less noise should be entered quickly using a trending system, while it would be best to wait for a better price if the market has been rated as high noise.
▪ A noisy market is one that continues to change direction, while an efficient market is smooth. When viewed in the long term, the level of market noise should determine the type of strategy that should be applied to each market.
62. Chaotic Patterns and Market Behavior
▪ Although each market has its own underlying level of noise, the measurement of fractal efficiency should be consistent over all markets. Markets may vary in volatility although their chaotic behavior is technically the same.
63. Entropy: Predicting by Similar Situations
▪ Entropy is a measure of disorder, or uncertainty, in a price series; a high-entropy market contains little usable information for forecasting.
▪ Predicting by similar situations means finding past periods whose price behavior most closely resembles the current pattern and using what followed those periods as a guide to the expected outcome.
64. Main Points to Remember
- Higher prices translate into higher volatility.
- Linear price scale: equal distance between prices.
- Logarithmic price scale: equal percent changes plotted as equal distances.
- Volatility is the key driver of risk, and measuring volatility correctly is necessary to control that risk.
- The time period over which volatility is measured is also a significant factor in the price-volatility relationship.
- Volatility measurement methods:
  - The change in price over n days
  - The maximum price fluctuation during the n days
  - The average true range
  - The sum of the absolute price changes over n days (relative volatility)
  - Classic annualized volatility for daily data (annualized volatility)
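Two of the measurement methods listed above, average true range and classic annualized volatility, can be sketched in Python (function names are illustrative, not from the source):

```python
import math
import statistics

def true_range(high, low, prev_close):
    """True range: the largest of the day's range and the gaps
    from the previous close."""
    return max(high - low, abs(high - prev_close), abs(low - prev_close))

def average_true_range(highs, lows, closes, n):
    """Simple average of the last n true ranges."""
    trs = [true_range(h, l, pc)
           for h, l, pc in zip(highs[1:], lows[1:], closes[:-1])]
    return sum(trs[-n:]) / n

def annualized_volatility(closes, periods_per_year=252):
    """Classic annualized volatility for daily data:
    standard deviation of log returns scaled by sqrt(252)."""
    rets = [math.log(b / a) for a, b in zip(closes, closes[1:])]
    return statistics.stdev(rets) * math.sqrt(periods_per_year)
```

A smooth exponential price path has (near-)zero annualized volatility because every log return is identical, which is the sense in which annualized volatility measures dispersion of returns rather than price level.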
65. Predicting Volatility with Trading Ranges
- Thomas Bierovic: On-Balance True Range
- VIX Trading Systems - Connors
- MarketSci Blog
- Gerald Appel on VIX
- Fractals, Chaos, and Entropy
- Trends and Price Noise
- Trends and Interest Rate Carry
- Neural Networks
- Modeling Human Behavior
- Genetic Algorithms
- Liquidity
- Trade Selection Using Volatility
66. Main Points to Remember
- Intraday Volatility and Volume (Options Time & Strategy in Intraday)
- Predicting Volatility with Trading Ranges
- On-Balance True Range, following the same rules as On-Balance Volume (OBV)
- VIX Trading Systems - Connors
- MarketSci Blog (10 SMA & 10 EMA on VIX to generate buy & sell signals)
- Gerald Appel on VIX (volatility breakout - buy system)
- Trends and Price Noise (Volume)
  - The change in price over n days
  - The maximum price fluctuation during the n days
  - The average true range
  - The sum of the absolute price changes over n days (relative volatility)
  - Classic annualized volatility for daily data (annualized volatility)
- Trade Selection Using Volatility:
  1. High volatility
  2. Eliminate or delay? (at the time of the entry signal)
  3. Constructing a volatility filter
  4. Standard deviation measurement
  5. High-volatility exits: reducing the risk
  6. Ranking based on volatility
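The "eliminate or delay" and "standard deviation measurement" ideas above can be combined into a minimal volatility filter; the threshold choice and function name here are assumptions for illustration:

```python
import statistics

def volatility_filter(closes, entry_signal, n=20, max_stdevs=2.0):
    """Hypothetical volatility filter: pass an entry signal through only
    when the latest absolute price change is within max_stdevs standard
    deviations of the previous n changes; otherwise eliminate or delay
    the entry because current volatility is unusually high."""
    changes = [b - a for a, b in zip(closes, closes[1:])]
    recent, latest = changes[-n - 1:-1], changes[-1]
    threshold = max_stdevs * statistics.pstdev(recent)
    return entry_signal and abs(latest) <= threshold
```

With a two-standard-deviation cutoff, roughly the largest few percent of moves are filtered out; the trader can either skip those entries entirely or wait for a better price, as the slides suggest.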
67. Main Points to Remember
- One main reason for disappointing performance is a lack of understanding of market liquidity at the time of execution.
- Fractals, Chaos, and Entropy
- Chaos theory is a way to describe the complex behavior of nonlinear systems.
- Fractal dimension is the degree of roughness or irregularity of a structure or system.
- Although each market has its own underlying level of noise, the measurement of fractal efficiency should be consistent over all markets. Markets may vary in volatility although their chaotic behavior is technically the same.
- Entropy: Predicting by Similar Situations
68. Main Points to Remember
- Neural Networks: neural networks are recognized as a powerful tool for uncovering market relationships.
- A genetic algorithm is actually a sophisticated search method that replaces standard optimization; it uses a technique that parallels the survival of the fittest.
- Modeling human behavior means treating the human as a device with a large number of internal mental states, each with its own particular control behavior and interstate transition probabilities.