MACD Enhanced [DCAUT]
█ MACD Enhanced Indicator
📊 Indicator Overview
MACD Enhanced is an enhanced version of the traditional MACD indicator. Unlike the standard MACD, which is limited to the EMA algorithm, this enhanced version supports different moving average algorithms for both the main MACD line and the signal line, allowing traders to optimize MACD calculations based on market conditions and preferences.
🎯 Core Advantages
Difference from Traditional MACD
Traditional MACD : Fixed EMA algorithm for both main and signal lines
MACD Enhanced : Supports different moving average algorithms for main and signal lines
Algorithm Flexibility Benefits
Fast Response : EMA, HMA algorithms respond quickly to price changes
Stable Smoothing : SMA, RMA algorithms provide more stable signals
Adaptive Adjustment : KAMA, FRAMA algorithms automatically adjust based on market conditions
Noise Filtering : Kalman Filter, Super Smoother reduce false signals
🚀 Usage Guide
Parameter Settings
Source : Data source (default: close price)
Fast Length : Fast line period (recommended: 12)
Slow Length : Slow line period (recommended: 26)
Signal Length : Signal line period (recommended: 9)
MACD MA Type : Moving average algorithm for main MACD line (default: EMA)
Signal MA Type : Moving average algorithm for signal line (default: EMA)
Algorithm Combination Guidelines
Fast Trading : Choose EMA, HMA for both main and signal lines
Stable Analysis : Choose SMA, RMA for both main and signal lines
Mixed Strategy : Choose fast algorithm for main line, smooth algorithm for signal line
Trend Following : Choose KAMA for main line, EMA for signal line
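As an illustration, a minimal Pine Script sketch of a MACD with selectable MA algorithms might look like the following. This is a sketch, not the published source: it is restricted to built-in MA types (adaptive algorithms such as KAMA or FRAMA require custom implementations), and the histogram colors simply mirror the scheme described in the next section.
//@version=6
indicator("MACD Enhanced - minimal sketch")
// compute every supported MA up front so no ta.* call sits inside a conditional branch
ma(src, len, kind) =>
    ema = ta.ema(src, len)
    sma = ta.sma(src, len)
    rma = ta.rma(src, len)
    hma = ta.hma(src, len)
    switch kind
        "SMA" => sma
        "RMA" => rma
        "HMA" => hma
        => ema
srcIn    = input.source(close, "Source")
fastLen  = input.int(12, "Fast Length")
slowLen  = input.int(26, "Slow Length")
sigLen   = input.int(9,  "Signal Length")
macdKind = input.string("EMA", "MACD MA Type",   options = ["EMA", "SMA", "RMA", "HMA"])
sigKind  = input.string("EMA", "Signal MA Type", options = ["EMA", "SMA", "RMA", "HMA"])
macdLine = ma(srcIn, fastLen, macdKind) - ma(srcIn, slowLen, macdKind)
sigLine  = ma(macdLine, sigLen, sigKind)
hist     = macdLine - sigLine
plot(macdLine, "MACD",   color.blue)
plot(sigLine,  "Signal", color.orange)
// dark = momentum building, light = momentum fading (see Color Coding below)
plot(hist, "Histogram", style = plot.style_columns,
     color = hist >= 0 ? (hist > hist[1] ? color.new(color.green, 0) : color.new(color.green, 60))
                       : (hist < hist[1] ? color.new(color.red, 0)   : color.new(color.red, 60)))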
📊 Color Coding
Line Colors
Blue : MACD main line
Orange : Signal line
Histogram
Dark Green : Histogram > 0 and rising (strong bullish momentum)
Light Green : Histogram > 0 and falling (weak bullish momentum)
Light Red : Histogram < 0 and rising (weak bearish momentum)
Dark Red : Histogram < 0 and falling (strong bearish momentum)
💡 Core Value
By selecting different moving average algorithm combinations, traders can obtain:
- MACD signals better suited to current market conditions
- Trend tracking capabilities with different sensitivity levels
- Algorithm optimization possibilities that traditional MACD cannot provide
📄 License : MIT License
👨💻 Developer : DCAUT Team
POC Migration Velocity (POC-MV) [PhenLabs]
📊 POC Migration Velocity (POC-MV)
Version: PineScript™ v6
📌Description
The POC Migration Velocity indicator revolutionizes market structure analysis by tracking the movement, speed, and acceleration of Point of Control (POC) levels in real-time. This tool combines sophisticated volume distribution estimation with velocity calculations to reveal hidden market dynamics that conventional indicators miss.
POC-MV provides traders with unprecedented insight into volume-based price movement patterns, enabling the early identification of continuation and exhaustion signals before they become apparent to the broader market. By measuring how quickly and consistently the POC migrates across price levels, traders gain early warning signals for significant market shifts and can position themselves advantageously.
The indicator employs advanced algorithms to estimate intra-bar volume distribution without requiring lower timeframe data, making it accessible across all chart timeframes while maintaining sophisticated analytical capabilities.
🚀Points of Innovation
Micro-POC calculation using advanced OHLC-based volume distribution estimation
Real-time velocity and acceleration tracking normalized by ATR for cross-market consistency
Persistence scoring system that quantifies directional consistency over multiple periods
Multi-signal detection combining continuation patterns, exhaustion signals, and gap alerts
Dynamic color-coded visualization system with intensity-based feedback
Comprehensive customization options for resolution, periods, and thresholds
🔧Core Components
POC Calculation Engine: Estimates volume distribution within each bar using configurable price bands and sophisticated weighting algorithms
Velocity Measurement System: Tracks the rate of POC movement over customizable lookback periods with ATR normalization
Acceleration Calculator: Measures the rate of change of velocity to identify momentum shifts in POC migration
Persistence Analyzer: Quantifies how consistently POC moves in the same direction using exponential weighting
Signal Detection Framework: Combines trend analysis, velocity thresholds, and persistence requirements for signal generation
Visual Rendering System: Provides dynamic color-coded lines and heat ribbons based on velocity and price-POC relationships
🔥Key Features
Real-time POC calculation with 10-100 configurable price bands for optimal precision
Velocity tracking with customizable lookback periods from 5 to 50 bars
Acceleration measurement for detecting momentum changes in POC movement
Persistence scoring to validate signal strength and filter false signals
Dynamic visual feedback with blue/orange color scheme indicating bullish/bearish conditions
Comprehensive alert system for continuation patterns, exhaustion signals, and POC gaps
Adjustable information table displaying real-time metrics and current signals
Heat ribbon visualization showing price-POC relationship intensity
Multiple threshold settings for customizing signal sensitivity
Export capability for use with separate panel indicators
🎨Visualization
POC Connecting Lines: Color-coded lines showing POC levels with intensity based on velocity magnitude
Heat Ribbon: Dynamic colored ribbon around price showing POC-price basis intensity
Signal Markers: Clear exhaustion top/bottom signals with labeled shapes
Information Table: Real-time display of POC value, velocity, acceleration, basis, persistence, and current signal status
Color Gradients: Blue gradients for bullish conditions, orange gradients for bearish conditions
📖Usage Guidelines
POC Calculation Settings
POC Resolution (Price Bands): Default 20, Range 10-100. Controls the number of price bands used to estimate volume distribution within each bar
Volume Weight Factor: Default 0.7, Range 0.1-1.0. Adjusts the influence of volume in POC calculation
POC Smoothing: Default 3, Range 1-10. EMA smoothing period applied to the calculated POC to reduce noise
Velocity Settings
Velocity Lookback Period: Default 14, Range 5-50. Number of bars used to calculate POC velocity
Acceleration Period: Default 7, Range 3-20. Period for calculating POC acceleration
Velocity Significance Threshold: Default 0.5, Range 0.1-2.0. Minimum normalized velocity for continuation signals
Persistence Settings
Persistence Lookback: Default 5, Range 3-20. Number of bars examined for persistence score calculation
Persistence Threshold: Default 0.7, Range 0.5-1.0. Minimum persistence score required for continuation signals
Visual Settings
Show POC Connecting Lines: Toggle display of colored lines connecting POC levels
Show Heat Ribbon: Toggle display of colored ribbon showing POC-price relationship
Ribbon Transparency: Default 70, Range 0-100. Controls transparency level of heat ribbon
Alert Settings
Enable Continuation Alerts: Toggle alerts for continuation pattern detection
Enable Exhaustion Alerts: Toggle alerts for exhaustion pattern detection
Enable POC Gap Alerts: Toggle alerts for significant POC gaps
Gap Threshold: Default 2.0 ATR, Range 0.5-5.0. Minimum gap size to trigger alerts
✅Best Use Cases
Identifying trend continuation opportunities when POC velocity aligns with price direction
Spotting potential reversal points through exhaustion pattern detection
Confirming breakout validity by monitoring POC gap behavior
Adding volume-based context to traditional technical analysis
Managing position sizing based on POC-price basis strength
⚠️Limitations
POC calculations are estimations based on OHLC data, not true tick-by-tick volume distribution
Effectiveness may vary in low-volume or highly volatile market conditions
Requires complementary analysis tools for complete trading decisions
Signal frequency may be lower in ranging markets compared to trending conditions
Performance optimization needed for very short timeframes below 1-minute
💡What Makes This Unique
Advanced Estimation Algorithm: Sophisticated method for calculating POC without requiring lower timeframe data
Velocity-Based Analysis: Focus on POC movement dynamics rather than static levels
Comprehensive Signal Framework: Integration of continuation, exhaustion, and gap detection in one indicator
Dynamic Visual Feedback: Intensity-based color coding that adapts to market conditions
Persistence Validation: Unique scoring system to filter signals based on directional consistency
🔬How It Works
Volume Distribution Estimation:
Divides each bar into configurable price bands for volume analysis
Applies sophisticated weighting based on OHLC relationships and proximity to close
Identifies the price level with maximum estimated volume as the POC
Velocity and Acceleration Calculation:
Measures POC rate of change over specified lookback periods
Normalizes values using ATR for consistent cross-market performance
Calculates acceleration as the rate of change of velocity
Signal Generation Process:
Combines trend direction analysis using EMA crossovers
Applies velocity and persistence thresholds to filter signals
Generates continuation, exhaustion, and gap alerts based on specific criteria
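The published source is closed, but the core mechanics described above - estimating a per-bar POC from price bands and normalizing its velocity by ATR - can be sketched as follows. The proximity-to-close weighting used here is an assumption for illustration, not the script's actual weighting scheme.
//@version=6
indicator("POC-MV - velocity sketch", overlay = true)
bands     = input.int(20, "POC Resolution (Price Bands)", minval = 10, maxval = 100)
velLkb    = input.int(14, "Velocity Lookback Period", minval = 5, maxval = 50)
smoothing = input.int(3,  "POC Smoothing", minval = 1, maxval = 10)
// crude intra-bar volume estimate: weight each price band by its proximity to the close
poc() =>
    step      = (high - low) / bands
    bestPrice = close
    bestW     = -1.0
    for i = 0 to bands - 1
        price = low + step * (i + 0.5)
        w = volume * (1.0 - math.abs(price - close) / math.max(high - low, syminfo.mintick))
        if w > bestW
            bestW     := w
            bestPrice := price
    bestPrice
pocSmooth = ta.ema(poc(), smoothing)
velocity  = (pocSmooth - pocSmooth[velLkb]) / ta.atr(14)  // ATR-normalised POC velocity
accel     = velocity - velocity[7]                        // rate of change of velocity
plot(pocSmooth, "POC", color.blue)
plot(velocity, "POC Velocity", display = display.data_window)
plot(accel, "POC Acceleration", display = display.data_window)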
💡Note:
This indicator provides estimated POC calculations based on available OHLC data and should be used in conjunction with other analysis methods. The velocity-based approach offers unique insights into market structure dynamics but requires proper risk management and complementary analysis for optimal trading decisions.
[Pandora][Swarm] Rapid Exponential Moving Average
ENVISIONING POSSIBILITY
What is the theoretical pinnacle of possibility? The current state of algorithmic affairs falls far short of my aspirations for achievable feasibility. I'm lifting the lid off of Pandora's box once again, very publicly this time, as a brute force challenge to conventional 'wisdom'. The unfolding series of time mandates a transcendental systemic alteration...
THE MOVING AVERAGE ZOO:
The realm of digital signal processing for trading is filled with familiar antiquated filtering tools. Two families of filtration, being 'infinite impulse response' (EMA, RMA, etc.) and 'finite impulse response' (WMA, SMA, etc.), are prevalently employed without question. These filter types are the mules and donkeys of data analysis, broadly accepted for use in finance.
At first glance, they appear sufficient for most tasks, offering a basic straightforward way to reduce noise and highlight trends. Yet, beneath their simplistic facade lies a constellation of limitations and impediments, each having its own finicky quirks. Upon closer inspection, identifiable drawbacks render them far from ideal for many real-world applications in today's volatile markets.
KNOWN FUNDAMENTAL FLAWS:
Despite commonplace moving average (MA) popularity, these conventional filters suffer from an assortment of fundamental flaws. Most of them don't genuinely address core challenges of how to preserve the true dynamics of a signal while suppressing noise and retaining cutoff frequency compliance. Their simple cookie cutter structures make them ill-suited in actuality for dynamic market environments. In reality, they often trade one problem for another dilemma, forsaking analytics to choose between distortion and delay.
A deeper-seated issue remains within frequency compliance: how adequately a filter respects (or disrespects) the underlying signal’s spectral properties according to its assigned period parameter. Traditional MAs habitually distort phase relationships, causing delayed reactions with surplus lag or exaggerations with excessive undershoot/overshoot. For applications requiring timely resilience, such as algorithmic trading, these shortcomings are often functionally unacceptable. What’s needed are vigorous filters that can more accurately retain signal behaviors while minimizing lag without sacrificing smoothness and uniformity. Until then, the public MA zoo remains a collection of corny compromises, rather than a favorable toolbelt of solutions.
P.S.: In PSv7+, in my opinion, many of these geriatric MAs deserve no future with ease of access for the naive, simply not knowing these filters are most likely creating bigger problems than solving any.
R.E.M.A.
What is this? I prefer to think of it as the "radical EMA", definitely along my lines of a retire everything morte algorithm. This isn't your run of the mill average from the petting zoo. I would categorize it as a paradigm shifting rampant economic masochistic annihilator, sufficiently good enough to begin ruthlessly executing moving averages left and right. Um, yeah... that kind of moving average destructor as you may soon recognize with a few 'Filters+' settings adjustments, realizing ordinary EMA has been doing us an injustice all this time.
Does it possess the capability to relentlessly exterminate most averaging filters in existence? Well, it's about time we find out, by uncaging it on the loose into the greater economic wilderness. Only then can we truly find out if it is indeed a radical exponential market accelerant whose time has come. If it is, then it may eventually become a reality erasing monolithic anomaly destined for greatness, ultimately changing the entire landscape of trading in perpetuity.
UNLEASHING NEXT-GEN:
This lone next generation exoweapon algorithm is intended to initiate the transformative beginning stages of mass filtration deprecation. However, it won't be the only one, just the first arrival of its alien kind from me. Welcome to notion #1 of my future filtration frontier, on this episode of the algorithmic twilight zone. Where reality takes a twisting turn one dimension beyond practical logic, after persistent models of mindset disintegrate into insignificance, followed by illusory perception confronted into cognitive dissonance.
An evolutionary path to genuine advancement resides outside the prison of preconceptions, manifesting only after divergence from persistent binding restrictions of dogmatic doctrines. Such a genesis in transformative thinking will catalyze unbounded cognitive potential, plowing the way for the cultivation of total redesigns of thought. Futuristic innovative breakthroughs demand the surrender of legacy and outmoded understandings.
Now that the world's largest assembly of investors has been ensembled, there are additional tasks left to perform. I'm compelled to deploy this mathematical-weapon of mass financial creation into its rightful destined hands, to "WE THE PEOPLE" of TV.
SCRIPT INTENTION:
Deprecate anything and everything as any non-commercial member sees desirably fit. This includes your existing code formulations already in working functional modes of operation AND/OR future projects in the works. Swapping is nearly as simple as copying and pasting with meager modifications, after you have identified comparable likeness in this indicators settings with a visual assessment. Results may become eye opening, but only if you dare to look and test.
Where you may suspect a ta.filter() is lacking sufficient luster or may be flat out majorly deficient, employing rema, drema, trema, or qrema configurations may be a more suitable replacement. That's up to you to discern. My code satire already identifies likely bottom of the barrel suspects that either belong in the extinction record or have already been marked for deprecation. They are ordered more towards the bottom by rank where they belong. SuperSmoother is a masterpiece here to stay, being my original go-to reference filter. Everything you see here is already deprecated, including REMA...
REMA CHARACTERISTICS
- VERY low lag
- No overshoot
- Frequency compliant
- Proper initialization at bar_index==0
- Period parameter accepts positive floating point numerics (AND integers!)
- Infinite impulse response (IIR) filter
- Compact code footprint
- Minimized computational overhead
3D Surface Modeling [PhenLabs]
📊 3D Surface Modeling
Version: PineScript™ v6
📌 Description
The 3D Surface Modeling indicator revolutionizes technical analysis by generating three-dimensional visualizations of multiple technical indicators across various timeframes. This advanced analytical tool processes and renders complex indicator data through a sophisticated matrix-based calculation system, creating an intuitive 3D surface representation of market dynamics.
The indicator employs array-based computations to simultaneously analyze multiple instances of selected technical indicators, mapping their behavior patterns across different temporal dimensions. This unique approach enables traders to identify complex market patterns and relationships that may be invisible in traditional 2D charts.
🚀 Points of Innovation
Matrix-Based Computation Engine: Processes up to 500 concurrent indicator calculations in real-time
Dynamic 3D Rendering System: Creates depth perception through sophisticated line arrays and color gradients
Multi-Indicator Integration: Seamlessly combines VWAP, Hurst, RSI, Stochastic, CCI, MFI, and Fractal Dimension analyses
Adaptive Scaling Algorithm: Automatically adjusts visualization parameters based on indicator type and market conditions
🔧 Core Components
Indicator Processing Module: Handles real-time calculation of multiple technical indicators using array-based mathematics
3D Visualization Engine: Converts indicator data into three-dimensional surfaces using line arrays and color mapping
Dynamic Scaling System: Implements custom normalization algorithms for different indicator types
Color Gradient Generator: Creates depth perception through programmatic color transitions
🔥 Key Features
Multi-Indicator Support: Comprehensive analysis across seven different technical indicators
Customizable Visualization: User-defined color schemes and line width parameters
Real-time Processing: Continuous calculation and rendering of 3D surfaces
Cross-Timeframe Analysis: Simultaneous visualization of indicator behavior across multiple periods
🎨 Visualization
Surface Plot: Three-dimensional representation using up to 500 lines with dynamic color gradients
Depth Indicators: Color intensity variations showing indicator value magnitude
Pattern Recognition: Visual identification of market structures across multiple timeframes
📖 Usage Guidelines
Indicator Selection
Type: VWAP, Hurst, RSI, Stochastic, CCI, MFI, Fractal Dimension
Default: VWAP
Starting Length: Minimum 5 periods
Default: 10
Step Size: Interval between calculations
Range: 1-10
Visualization Parameters
Color Scheme: Green, Red, Blue options
Line Width: 1-5 pixels
Surface Resolution: Up to 500 lines
✅ Best Use Cases
Multi-timeframe market analysis
Pattern recognition across different technical indicators
Trend strength assessment through 3D visualization
Market behavior study across multiple periods
⚠️ Limitations
High computational resource requirements
Maximum 500 line restriction
Requires substantial historical data
Complex visualization learning curve
🔬 How It Works
1. Data Processing:
Calculates selected indicator values across multiple timeframes
Stores results in multi-dimensional arrays
Applies custom scaling algorithms
2. Visualization Generation:
Creates line arrays for 3D surface representation
Applies color gradients based on value magnitude
Renders real-time updates to surface plot
3. Display Integration:
Synchronizes with chart timeframe
Updates surface plot dynamically
Maintains visual consistency across updates
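As a rough sketch of this pipeline - an assumption about the implementation, not the published code - one value is computed per length and one line segment is drawn per adjacent pair of lengths, with an x-offset to fake depth. A manual SMA is used because built-in ta.* calls should not run inside loops with varying arguments.
//@version=6
indicator("3D surface - sketch", max_lines_count = 500, max_bars_back = 500)
startLen = input.int(10, "Starting Length", minval = 5)
stepSize = input.int(2,  "Step Size", minval = 1, maxval = 10)
count    = input.int(50, "Surface Resolution (lines)", maxval = 500)
// manual SMA so the length can vary per loop iteration
smaAt(int len) =>
    s = 0.0
    for i = 0 to len - 1
        s += close[i]
    s / len
var line[] surf = array.new_line()
if barstate.islast
    while array.size(surf) > 0            // clear the previous frame
        line.delete(array.pop(surf))
    float prev = smaAt(startLen)
    for j = 1 to count - 1
        float cur = smaAt(startLen + j * stepSize)
        // deeper rows are drawn further left and more transparent
        array.push(surf, line.new(bar_index - j + 1, prev, bar_index - j, cur,
             color = color.new(color.green, math.min(90, j))))
        prev := cur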
🌟 Credits:
Inspired by LonesomeTheBlue (modified for multiple indicator types with scaling fixes and additional unique mappings)
💡 Note:
Optimal performance requires sufficient computing resources and historical data. Users should start with default settings and gradually adjust parameters based on their analysis requirements and system capabilities.
ICT Opening Range Projections (tristanlee85)
ICT Opening Range Projections
This indicator visualizes key price levels based on ICT's (Inner Circle Trader) "Opening Range" concept. This 30-minute time interval establishes price levels that the algorithm will refer to throughout the session. The indicator displays these levels, including standard deviation projections, internal subdivisions (quadrants), and the opening price.
🟪 What It Does
The Opening Range is a crucial 30-minute window where market algorithms establish significant price levels. ICT theory suggests this range forms the basis for daily price movement.
This script helps you:
Mark the high, low, and opening price of each session.
Divide the range into quadrants (premium, discount, and midpoint/Consequent Encroachment).
Project potential price targets beyond the range using configurable standard deviation multiples .
🟪 How to Use It
This tool aids in time-based technical analysis rooted in ICT's Opening Range model, helping you observe price interaction with algorithmic levels.
Example uses include:
Identifying early structural boundaries.
Observing price behavior within premium/discount zones.
Visualizing initial displacement from the range to anticipate future moves.
Comparing price reactions at projected standard deviation levels.
Aligning price action with significant times like London or NY Open.
Note: This indicator provides a visual framework; it does not offer trade signals or interpretations.
🟪 Key Information
Time Zone: New York time (ET) is required on your chart.
Sessions: Supports multiple sessions, including NY midnight, NY AM, NY PM, and three custom timeframes.
Time Interval: Supports multi-timeframe up to 15 minutes. Best used on a 1-minute chart for accuracy.
🟪 Session Options
The Opening Range interval is configurable for up to 6 sessions:
Pre-defined ICT Sessions:
NY Midnight: 12:00 AM – 12:30 AM ET
NY AM: 9:30 AM – 10:00 AM ET
NY PM: 1:30 PM – 2:00 PM ET
Custom Sessions:
Three user-defined start/end time pairs.
This example shows a custom session from 03:30 - 04:00:
🟪 Understanding the Levels
The Opening Price is the open of the first 1-minute candle within the chosen session.
At session close, the Opening Range is calculated using its High and Low . An optional swing-based mode uses swing highs/lows for range boundaries.
The range is divided into quadrants by its midpoint ( Consequent Encroachment or CE):
Upper Quadrant: CE to high (premium).
Lower Quadrant: Low to CE (discount).
These subdivisions help visualize internal range dynamics, where price often reacts during algorithmic delivery.
🟪 Working with Ranges
By default, the range is determined by the highest high and lowest low of the 30-minute session:
A range can also be determined by the highest/lowest swing points:
Quadrants outline the premium and discount of a range that price will reference:
Small ranges still follow the same algorithmic logic, but may be deemed insignificant for one's trading. These can be filtered in the settings by specifying a minimum ticks limit. In this example, the range is 42 ticks (10.5 points) but the indicator is configured for 80 ticks (20 points). We can select which levels will plot if the range is below the limit. Here, only the 00:00 opening price is plotted:
You may opt to include the range high/low, quadrants, and projections as well. This will plot a red (configurable) range bracket to indicate it is below the limit while plotting the levels:
🟪 Price Projections
Projections extend beyond the Opening Range using standard deviations, framing the market beyond the initial session and identifying potential targets. You define the standard deviation multiples (e.g., 1.0, 1.5, 2.0).
Both positive and negative extensions are displayed, symmetrically projected from the range's high and low.
The Dynamic Levels option plots only the next projection level once price crosses the previous extreme. For example, only the 0.5 STDEV level plots until price reaches it, then the 1.0 level appears, and so on. This continues up to your defined maximum projections, or indefinitely if standard deviations are set to 0.
This example shows dynamic levels for a total of 6 sessions, only 1 of which meets a configured minimum limit of 50 ticks:
Small ranges followed by significant displacement are impacted the most by the number of levels plotted. You may hide projections when configuring the minimum ticks.
A fixed standard deviation will plot levels in both directions, regardless of the price range. Here, we plot up to 3.0 while hiding projections for small ranges:
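To make the level arithmetic concrete, here is a minimal sketch of the NY AM case. It assumes the common ICT convention that one "standard deviation" equals one multiple of the range projected from its high/low; the indicator itself exposes sessions and multiples as inputs, and swing-based ranges, dynamic levels, and the tick filter are omitted.
//@version=6
indicator("Opening Range levels - sketch", overlay = true)
// 09:30-10:00 NY AM session (the chart must use New York time, per the indicator's note)
inOR = not na(time(timeframe.period, "0930-1000", "America/New_York"))
var float orOpen = na
var float orHigh = na
var float orLow  = na
if inOR and not inOR[1]            // first bar of the session
    orOpen := open
    orHigh := high
    orLow  := low
else if inOR
    orHigh := math.max(orHigh, high)
    orLow  := math.min(orLow, low)
orRange = orHigh - orLow
ce      = orLow + orRange * 0.50   // Consequent Encroachment (midpoint)
q75     = orLow + orRange * 0.75   // upper quadrant boundary (premium)
q25     = orLow + orRange * 0.25   // lower quadrant boundary (discount)
projUp  = orHigh + orRange * 1.0   // +1.0 "standard deviation" projection
projDn  = orLow  - orRange * 1.0   // -1.0 projection
plot(inOR ? na : ce,     "CE",   color.gray,   style = plot.style_linebr)
plot(inOR ? na : q75,    "Q75",  color.silver, style = plot.style_linebr)
plot(inOR ? na : q25,    "Q25",  color.silver, style = plot.style_linebr)
plot(inOR ? na : projUp, "+1.0", color.teal,   style = plot.style_linebr)
plot(inOR ? na : projDn, "-1.0", color.maroon, style = plot.style_linebr)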
🟪 Legal Disclaimer
This indicator is provided for informational and educational purposes only. It is not financial advice, and should not be construed as a recommendation to buy or sell any financial instrument. Trading involves substantial risk, and you could lose a significant amount of money. Past performance is not indicative of future results. Always consult with a qualified financial professional before making any trading or investment decisions. The creators and distributors of this indicator assume no responsibility for your trading outcomes.
Momentum + Keltner Stochastic Combo
The Momentum-Keltner-Stochastic Combination Strategy: A Technical Analysis and Empirical Validation
This study presents an advanced algorithmic trading strategy that implements a hybrid approach between momentum-based price dynamics and relative positioning within a volatility-adjusted Keltner Channel framework. The strategy utilizes an innovative "Keltner Stochastic" concept as its primary decision-making factor for market entries and exits, while implementing a dynamic capital allocation model with risk-based stop-loss mechanisms. Empirical testing demonstrates the strategy's potential for generating alpha in various market conditions through the combination of trend-following momentum principles and mean-reversion elements within defined volatility thresholds.
1. Introduction
Financial market trading increasingly relies on the integration of various technical indicators for identifying optimal trading opportunities (Lo et al., 2000). While individual indicators are often compromised by market noise, combinations of complementary approaches have shown superior performance in detecting significant market movements (Murphy, 1999; Kaufman, 2013). This research introduces a novel algorithmic strategy that synthesizes momentum principles with volatility-adjusted envelope analysis through Keltner Channels.
2. Theoretical Foundation
2.1 Momentum Component
The momentum component of the strategy builds upon the seminal work of Jegadeesh and Titman (1993), who demonstrated that stocks which performed well (poorly) over a 3 to 12-month period continue to perform well (poorly) over subsequent months. As Moskowitz et al. (2012) further established, this time-series momentum effect persists across various asset classes and time frames. The present strategy implements a short-term momentum lookback period (7 bars) to identify the prevailing price direction, consistent with findings by Chan et al. (2000) that shorter-term momentum signals can be effective in algorithmic trading systems.
2.2 Keltner Channels
Keltner Channels, as formalized by Chester Keltner (1960) and later modified by Linda Bradford Raschke, represent a volatility-based envelope system that plots bands at a specified distance from a central exponential moving average (Keltner, 1960; Raschke & Connors, 1996). Unlike traditional Bollinger Bands that use standard deviation, Keltner Channels typically employ Average True Range (ATR) to establish the bands' distance from the central line, providing a smoother volatility measure as established by Wilder (1978).
2.3 Stochastic Oscillator Principles
The strategy incorporates a modified stochastic oscillator approach, conceptually similar to Lane's Stochastic (Lane, 1984), but applied to a price's position within Keltner Channels rather than standard price ranges. This creates what we term "Keltner Stochastic," measuring the relative position of price within the volatility-adjusted channel as a percentage value.
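In code, the concept reduces to a single normalization. A minimal sketch, assuming a 20-period channel with a 2x ATR multiplier (the paper does not fix these values):
//@version=6
indicator("Keltner Stochastic - sketch")
len  = input.int(20,    "Channel Length")
mult = input.float(2.0, "ATR Multiplier")
basis = ta.ema(close, len)
upper = basis + mult * ta.atr(len)
lower = basis - mult * ta.atr(len)
// relative position of price inside the Keltner Channel, as a percentage (0-100)
kStoch = 100 * (close - lower) / math.max(upper - lower, syminfo.mintick)
plot(kStoch, "Keltner Stochastic")
hline(20)
hline(80)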
3. Strategy Methodology
3.1 Entry and Exit Conditions
The strategy employs a contrarian approach within the channel framework:
Long Entry Condition:
Close price > Close price 7 periods ago (momentum filter, per the lookback in Section 2.1)
KeltnerStochastic < threshold (oversold within channel)
Short Entry Condition:
Close price < Close price 7 periods ago (momentum filter, per the lookback in Section 2.1)
KeltnerStochastic > threshold (overbought within channel)
Exit Conditions:
Exit long positions when KeltnerStochastic > threshold
Exit short positions when KeltnerStochastic < threshold
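Continuing the sketch above, the rules translate directly. The 7-bar momentum lookback follows Section 2.1; the 20/80 levels are assumed values for the calibration threshold, and the exits shown here read the rule as symmetric extremes:
// appended to the Keltner Stochastic sketch above
longEntry  = close > close[7] and kStoch < 20   // momentum up, oversold within the channel
shortEntry = close < close[7] and kStoch > 80   // momentum down, overbought within the channel
exitLong   = kStoch > 80                        // price has travelled to the opposite extreme
exitShort  = kStoch < 20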
This methodology aligns with research by Brock et al. (1992) on the effectiveness of trading range breakouts with confirmation filters.
3.2 Risk Management
Stop-loss mechanisms are implemented using fixed price movements (1185 index points), providing definitive risk boundaries per trade. This approach is consistent with findings by Sweeney (1988) that fixed stop-loss systems can enhance risk-adjusted returns when properly calibrated.
3.3 Dynamic Position Sizing
The strategy implements an equity-based position sizing algorithm that increases or decreases contract size based on cumulative performance:
$ContractSize = \min(baseContracts + \lfloor\frac{\max(profitLoss, 0)}{equityStep}\rfloor - \lfloor\frac{|\min(profitLoss, 0)|}{equityStep}\rfloor, maxContracts)$
This adaptive approach follows modern portfolio theory principles (Markowitz, 1952) and Kelly criterion concepts (Kelly, 1956), scaling exposure proportionally to account equity.
4. Empirical Performance Analysis
Using historical data across multiple market regimes, the strategy demonstrates several key performance characteristics:
Enhanced performance during trending markets with moderate volatility
Reduced drawdowns during choppy market conditions through the dual-filter approach
Optimal performance when the threshold parameter is calibrated to market-specific characteristics (Pardo, 2008)
5. Strategy Limitations and Future Research
While effective in many market conditions, this strategy faces challenges during:
Rapid volatility expansion events where stop-loss mechanisms may be inadequate
Prolonged sideways markets with insufficient momentum
Markets with structural changes in volatility profiles
Future research should explore:
Adaptive threshold parameters based on regime detection
Integration with additional confirmatory indicators
Machine learning approaches to optimize parameter selection across different market environments (Cavalcante et al., 2016)
References
Brock, W., Lakonishok, J., & LeBaron, B. (1992). Simple technical trading rules and the stochastic properties of stock returns. The Journal of Finance, 47(5), 1731-1764.
Cavalcante, R. C., Brasileiro, R. C., Souza, V. L., Nobrega, J. P., & Oliveira, A. L. (2016). Computational intelligence and financial markets: A survey and future directions. Expert Systems with Applications, 55, 194-211.
Chan, L. K. C., Jegadeesh, N., & Lakonishok, J. (2000). Momentum strategies. The Journal of Finance, 51(5), 1681-1713.
Jegadeesh, N., & Titman, S. (1993). Returns to buying winners and selling losers: Implications for stock market efficiency. The Journal of Finance, 48(1), 65-91.
Kaufman, P. J. (2013). Trading systems and methods (5th ed.). John Wiley & Sons.
Kelly, J. L. (1956). A new interpretation of information rate. The Bell System Technical Journal, 35(4), 917-926.
Keltner, C. W. (1960). How to make money in commodities. The Keltner Statistical Service.
Lane, G. C. (1984). Lane's stochastics. Technical Analysis of Stocks & Commodities, 2(3), 87-90.
Lo, A. W., Mamaysky, H., & Wang, J. (2000). Foundations of technical analysis: Computational algorithms, statistical inference, and empirical implementation. The Journal of Finance, 55(4), 1705-1765.
Markowitz, H. (1952). Portfolio selection. The Journal of Finance, 7(1), 77-91.
Moskowitz, T. J., Ooi, Y. H., & Pedersen, L. H. (2012). Time series momentum. Journal of Financial Economics, 104(2), 228-250.
Murphy, J. J. (1999). Technical analysis of the financial markets: A comprehensive guide to trading methods and applications. New York Institute of Finance.
Pardo, R. (2008). The evaluation and optimization of trading strategies (2nd ed.). John Wiley & Sons.
Raschke, L. B., & Connors, L. A. (1996). Street smarts: High probability short-term trading strategies. M. Gordon Publishing Group.
Sweeney, R. J. (1988). Some new filter rule tests: Methods and results. Journal of Financial and Quantitative Analysis, 23(3), 285-300.
Wilder, J. W. (1978). New concepts in technical trading systems. Trend Research.
Prediction Based on Linreg & Atr
We created this algorithm with the goal of predicting future prices 📊, specifically where the value of any asset will go in the next 20 periods ⏳. It uses linear regression based on past prices, calculating a slope and an intercept to forecast future behavior 🔮. This prediction is then adjusted according to market volatility, measured by the ATR 📉, and the direction of trend signals, which are based on the MACD and moving averages 📈.
How Does the Linreg & ATR Prediction Work?
1. Trend Calculation and Signals:
• Technical Indicators: We use short- and long-term exponential moving averages (EMA), RSI, MACD, and Bollinger Bands 📊 to assess market direction and sentiment (not visually presented in the script).
• Calculation Functions: These include functions to calculate slope, average, intercept, standard deviation, and Pearson's R, which are crucial for regression analysis 📉.
2. Predicting Future Prices:
• Linear Regression: The algorithm calculates the slope, average, and intercept of past prices to create a regression channel 📈, helping to predict the range of future prices 🔮.
• Standard Deviation and Pearson's R: These metrics determine the strength of the regression 🔍.
3. Adjusting the Prediction:
• The predicted value is adjusted by considering market volatility (ATR 📉) and the direction of trend signals 🔮, ensuring that the prediction is aligned with the current market environment 🌍.
4. Visualization:
• Prediction Lines and Bands: The algorithm plots lines that display the predicted future price along with a prediction range (upper and lower bounds) 📉📈.
5. EMA Cross Signals:
• EMA Conditions and Total Score: A bullish crossover signal is generated when the total score is positive and the short EMA crosses above the long EMA 📈. A bearish crossover signal is generated when the total score is negative and the short EMA crosses below the long EMA 📉.
6. Additional Considerations:
• Multi-Timeframe Regression Channel: The script calculates regression channels for different timeframes (5m, 15m, 30m, 4h) ⏳, helping determine the overall market direction 📊 (not visually presented).
Confidence Interpretation:
• High Confidence (close to 100%): Indicates strong alignment between timeframes with a clear trend (bullish or bearish) 🔥.
• Low Confidence (close to 0%): Shows disagreement or weak signals between timeframes ⚠️.
Confidence complements the interpretation of the prediction range and expected direction 🔮, aiding in decision-making for market entry or exit 🚀.
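The script's internals aren't shown, but the forecasting core described above - a least-squares fit extrapolated 20 periods ahead and widened by ATR - can be sketched as follows. The 100-bar lookback and the single-ATR band width are assumptions:
//@version=6
indicator("Linreg & ATR forecast - sketch", overlay = true)
len     = input.int(100, "Regression Lookback")
horizon = input.int(20,  "Forecast Horizon (periods)")
// endpoint of the least-squares line and its per-bar slope
endpoint = ta.linreg(close, len, 0)
slope    = endpoint - ta.linreg(close, len, 1)
forecast = endpoint + slope * horizon   // value extrapolated 'horizon' bars ahead
band     = ta.atr(14)                   // ATR-based uncertainty band
plot(forecast,        "Forecast", color.blue)
plot(forecast + band, "Upper",    color.green)
plot(forecast - band, "Lower",    color.red)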
RSI with Swing Trade by Kelvin_V
Algorithm Description: "RSI with Swing Trade by Kelvin_V"
1. Introduction:
This algorithm uses the RSI (Relative Strength Index) and optional Moving Averages (MA) to detect potential uptrends and downtrends in the market. The key feature of this script is that it visually changes the candle colors based on the market conditions, making it easier for users to identify potential trend swings or wave patterns.
The strategy offers flexibility by allowing users to enable or disable the MA condition. When the MA condition is enabled, the strategy will confirm trends using two moving averages. When disabled, the strategy will only use RSI to detect potential market swings.
2. Key Features of the Algorithm:
RSI (Relative Strength Index):
The RSI is used to identify potential market turning points based on overbought and oversold conditions.
When the RSI exceeds a predefined upper threshold (e.g., 60), it suggests a potential uptrend.
When the RSI drops below a lower threshold (e.g., 40), it suggests a potential downtrend.
Moving Averages (MA) - Optional:
Two Moving Averages (Short MA and Long MA) are used to confirm trends.
If the Short MA crosses above the Long MA, it indicates an uptrend.
If the Short MA crosses below the Long MA, it indicates a downtrend.
Users have the option to enable or disable this MA condition.
Visual Candle Coloring:
Green candles represent a potential uptrend, indicating a bullish move based on RSI (and MA if enabled).
Red candles represent a potential downtrend, indicating a bearish move based on RSI (and MA if enabled).
3. How the Algorithm Works:
RSI Levels:
The user can set RSI upper and lower bands to represent potential overbought and oversold levels. For example:
RSI > 60: Indicates a potential uptrend (bullish move).
RSI < 40: Indicates a potential downtrend (bearish move).
Optional MA Condition:
The algorithm also allows the user to apply the MA condition to further confirm the trend:
Short MA > Long MA: Confirms an uptrend, reinforcing a bullish signal.
Short MA < Long MA: Confirms a downtrend, reinforcing a bearish signal.
This condition can be disabled, allowing the user to focus solely on RSI signals if desired.
Swing Trade Logic:
Uptrend: If the RSI exceeds the upper threshold (e.g., 60) and (optionally) the Short MA is above the Long MA, the candles will turn green to signal a potential uptrend.
Downtrend: If the RSI falls below the lower threshold (e.g., 40) and (optionally) the Short MA is below the Long MA, the candles will turn red to signal a potential downtrend.
Visual Representation:
The candle colors change dynamically based on the RSI values and moving average conditions, making it easier for traders to visually identify potential trend swings or wave patterns without relying on complex chart analysis.
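A minimal sketch of this coloring logic is below; the 9/21 EMA lengths are assumptions, since the script exposes its own MA type and length inputs:
//@version=6
indicator("RSI swing coloring - sketch", overlay = true)
rsiLen    = input.int(4,     "RSI Length")
upperBand = input.float(60,  "RSI Upper Band (Potential Uptrend)")
lowerBand = input.float(40,  "RSI Lower Band (Potential Downtrend)")
useMA     = input.bool(true, "Enable MA Condition")
shortMA = ta.ema(close, 9)   // assumed lengths
longMA  = ta.ema(close, 21)
r    = ta.rsi(close, rsiLen)
bull = r > upperBand and (not useMA or shortMA > longMA)   // potential uptrend
bear = r < lowerBand and (not useMA or shortMA < longMA)   // potential downtrend
barcolor(bull ? color.green : bear ? color.red : na)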
4. User Customization:
The algorithm provides multiple customization options:
RSI Length: Users can adjust the period for RSI calculation (default is 4).
RSI Upper Band (Potential Uptrend): Users can customize the upper RSI level (default is 60) to indicate a potential bullish move.
RSI Lower Band (Potential Downtrend): Users can customize the lower RSI level (default is 40) to indicate a potential bearish move.
MA Type: Users can choose between SMA (Simple Moving Average) and EMA (Exponential Moving Average) for moving average calculations.
Enable/Disable MA Condition: Users can toggle the MA condition on or off, depending on whether they want to add moving averages to the trend confirmation process.
5. Benefits of the Algorithm:
Easy Identification of Trends: By changing candle colors based on RSI and MA conditions, the algorithm makes it easy for users to visually detect potential trend reversals and trend swings.
Flexible Conditions: The user has full control over the RSI and MA settings, allowing them to adapt the strategy to different market conditions and timeframes.
Clear Visualization: With the candle color changes, users can quickly recognize when a potential uptrend or downtrend is forming, enabling faster decision-making in their trading.
6. Example Usage:
Day traders: Can apply this strategy on short timeframes such as 5 minutes or 15 minutes to detect quick trends or reversals.
Swing traders: Can use this strategy on longer timeframes like 1 hour or 4 hours to identify and follow larger market swings.
Intramarket Difference Index Strategy
Hi Traders !!
The IDI Strategy:
In layman’s terms this strategy compares two indicators across markets and exploits their differences.
Note: it is best if the two markets are correlated, as then we know we are trading a short- to long-term deviation from both markets' general trend, with the assumption that both markets will trend again sometime in the future, thereby exhausting our trading opportunity.
📍 Important Notes:
This strategy calculates trade position size independently (i.e. risk per trade is controlled in the user inputs tab), which means that the ‘Order size’ input in the ‘Properties’ tab will have no effect on the strategy. Why? Because this allows us to define custom position size algorithms which we can use to improve our risk management and equity growth over time. Here we have the option of fixed quantity or fixed percentage of equity ATR (Average True Range) based stops, in addition to the turtle trading position size algorithm.
‘Pyramiding’ does not work for this strategy; similar to the order size input, toggling it will have no effect on the strategy, as the strategy explicitly defines the maximum order size to be 1.
This strategy is not perfect, and as of writing of this post I have not traded this algo.
Always take your time to backtests and debug the strategy.
🔷 The IDI Strategy:
By default this strategy pulls data from your current TV chart and then compares it to the base market, by default BINANCE:BTCUSD. The strategy pulls SMA and RSI data from either market (we call this the difference data), standardizes the data (solving the different unit problem across markets) such that it is comparable, and then differences the data, calling the result of this transformation and differencing the Intramarket Difference (ID). The formula for the ID is
ID = market1_diff_data - market2_diff_data (1)
Where
market(i)_diff_data = diff_data(i) / (ATR(j) of market(i))^0.5,
where i ∈ {1, 2} and the ATR lookback j is a positive integer
Formula (1) interpretation is the following
When ID > 0: this means the current market outperforms the base market
When ID = 0: Markets are at long run equilibrium
When ID < 0: this means the current market underperforms the base market
To form the strategy we define one of two strategy types, which are Trend and Mean Reversion respectively.
🔸 Trend Case:
Given the "Strategy Type" is set to TREND, we define a threshold: if the ID crosses over it we go long, and if the ID crosses under the negative of the threshold we go short.
The motivating idea is that the ID is an indicator of the two symbols being out of sync, and given we know volatility clustering, momentum and mean reversion of anomalies to be a stylised fact of financial data we can construct a trading premise. Let's first talk more about this premise.
For some markets (cryptocurrency markets - synthetic symbols in TV) the stylised fact of momentum is true, this means that higher momentum is followed by higher momentum, and given we know momentum to be a vector quantity (with magnitude and direction) this momentum can be both positive and negative i.e. when the ID crosses above some threshold we make an assumption it will continue in that direction for some time before executing back to its long run equilibrium of 0 which is a reasonable assumption to make if the market are correlated. For example for the BTCUSD - ETHUSD pair, if the ID > +threshold (inputs for MA and RSI based ID thresholds are found under the ‘‘INTRAMARKET DIFFERENCE INDEX’’ group’), ETHUSD outperforms BTCUSD, we assume the momentum to continue so we go long ETHUSD.
In the standard case we would exit the market when the ID returns to its long run equilibrium of 0 (for the positive case, the ID may return to 0 because ETH’s difference data has decreased or BTC’s difference data has increased). However, in this strategy we will not define this as our exit condition. Why?
This is because we want to "let our winners run"; to achieve this we define a trailing Donchian Channel stop loss (along with a fixed ATR based stop as our volatility proxy). If we were to use the 0 exit, the strategy may print a buy signal (ID > +threshold in the simple case; market regimes may be used), return to 0, and then print another buy signal, and this process can loop many times. This high trade frequency means we fail to capture the entire market move, lowering our profit; furthermore, on lower time frames high trade frequency means we pay more transaction costs (due to price slippage, commission and bid-ask spread), which means less profit.
By capturing the sum of many momentum moves we are essentially following the trend hence the trend following strategy type.
Here we also print the IDI (with default strategy settings, using the MA difference type); we can see that by letting our winners run we may catch many valid momentum moves, which results in a larger final pnl than if we would otherwise exit based on the equilibrium condition (valid trades are denoted by solid green and red arrows respectively, and all other valid trades which occur within the original signal are small light green and red arrows).
another example...
Note: if you would like to plot the IDI separately copy and paste the following code in a new Pine Script indicator template.
indicator("IDI")
// INTRAMARKET INDEX
var string g_idi = "intramarket diffirence index"
ui_index_1 = input.symbol("BINANCE:BTCUSD", title = "Base market", group = g_idi)
// ui_index_2 = input.symbol("BINANCE:ETHUSD", title = "Quote Market", group = g_idi)
type = input.string("MA", title = "Differrencing Series", options = , group = g_idi)
ui_ma_lkb = input.int(24, title = "lookback of ma and volatility scaling constant", group = g_idi)
ui_rsi_lkb = input.int(14, title = "Lookback of RSI", group = g_idi)
ui_atr_lkb = input.int(300, title = "ATR lookback - Normalising value", group = g_idi)
ui_ma_threshold = input.float(5, title = "Threshold of Upward/Downward Trend (MA)", group = g_idi)
ui_rsi_threshold = input.float(20, title = "Threshold of Upward/Downward Trend (RSI)", group = g_idi)
//>>+----------------------------------------------------------------+}
// CUSTOM FUNCTIONS |
//<<+----------------------------------------------------------------+{
// construct UDT (User defined type) containing the IDI (Intramarket Difference Index) source values
// UDT will hold many variables / functions grouped under the UDT
type functions
float Close // close price
float ma // ma of symbol
float rsi // rsi of the asset
float atr // atr of the asset
// the security data
getUDTdata(symbol, malookback, rsilookback, atrlookback) =>
indexHighTF = barstate.isrealtime ? 1 : 0
= request.security(symbol, timeframe = timeframe.period,
expression = [close , // Instentiate UDT variables
ta.sma(close, malookback) ,
ta.rsi(close, rsilookback) ,
ta.atr(atrlookback) ])
data = functions.new(close_, ma_, rsi_, atr_)
data
// Intramerket Difference Index
idi(type, symbol1, malookback, rsilookback, atrlookback, mathreshold, rsithreshold) =>
threshold = float(na)
index1 = getUDTdata(symbol1, malookback, rsilookback, atrlookback)
index2 = getUDTdata(syminfo.tickerid, malookback, rsilookback, atrlookback)
// declare difference variables for both base and quote symbols, conditional on which difference type is selected
var diffindex1 = 0.0, var diffindex2 = 0.0,
// declare Intramarket Difference Index based on series type, note
// if > 0, index 2 outpreforms index 1, buy index 2 (momentum based) until equalibrium
// if < 0, index 2 underpreforms index 1, sell index 1 (momentum based) until equalibrium
// for idi to be valid both series must be stationary and normalised so both series hae he same scale
intramarket_difference = 0.0
if type == "MA"
threshold := mathreshold
diffindex1 := (index1.Close - index1.ma) / math.pow(index1.atr*malookback, 0.5)
diffindex2 := (index2.Close - index2.ma) / math.pow(index2.atr*malookback, 0.5)
intramarket_difference := diffindex2 - diffindex1
else if type == "RSI"
threshold := rsilookback
diffindex1 := index1.rsi
diffindex2 := index2.rsi
intramarket_difference := diffindex2 - diffindex1
//>>+----------------------------------------------------------------+}
// STRATEGY FUNCTIONS CALLS |
//<<+----------------------------------------------------------------+{
// plot the intramarket difference
= idi(type,
ui_index_1,
ui_ma_lkb,
ui_rsi_lkb,
ui_atr_lkb,
ui_ma_threshold,
ui_rsi_threshold)
//>>+----------------------------------------------------------------+}
plot(intramarket_difference, color = color.orange)
hline(type == "MA" ? ui_ma_threshold : ui_rsi_threshold, color = color.green)
hline(type == "MA" ? -ui_ma_threshold : -ui_rsi_threshold, color = color.red)
hline(0)
Note it is possible that after printing a buy the strategy then prints many sell signals before returning to a buy, which again has the same implication (less profit, potentially because we exit early only for price to continue upwards, hence missing the larger "trend"). The image below showcases this scenario and again, by allowing our winners to run we may capture more profit (theoretically).
This should be clear...
🔸 Mean Reversion Case:
We stated prior that mean reversion of anomalies is a stylised fact of financial data. How can we exploit this?
We exploit this by normalizing the ID with the Ehlers Fisher transformation. The transformed data is then assumed to be approximately normally distributed. To form the strategy we employ the same logic as for the z-score: if the FT-normalized ID > 2.5 (< -2.5) we buy (short). Our exit conditions remain unchanged (fixed ATR stop and trailing Donchian Channel stop).
🔷 Position Sizing:
If ‘‘Fixed Risk From Initial Balance’’ is toggled true this means we risk a fixed percentage of our initial balance, if false we risk a fixed percentage of our equity (current balance).
Note we also employ a volatility adjusted position sizing formula, the turtle trading method, which is defined as follows.
Turtle position size = (1/ r * ATR * DV) * C
Where,
r = risk factor coefficient (default is 20)
ATR(j) = risk proxy, over j times steps
DV = Dollar Volatility, where DV = (1/Asset Price) * Capital at Risk
🔷 Risk Management:
Correct money management means we can limit risk and increase reward (theoretically). Here we employ
Max loss and gain per day
Max loss per trade
Max number of consecutive losing trades until trade skip
To read more see the tooltips (info circle).
🔷 Take Profit:
By default the script uses a Donchian Channel as a trailing stop and take profit. In addition to this, the script defines fixed ATR stop losses (by default; this covers cases where the DC range may be too wide, making a fixed ATR stop useful). ATR take profits are also defined but optional.
ATR SL and TP defined for all trades
🔷 Hurst Regime (Regime Filter):
The Hurst Exponent (H) aims to segment the market into three different states, Trending (H > 0.5), Random Geometric Brownian Motion (H = 0.5) and Mean Reverting / Contrarian (H < 0.5). In my interpretation this can be used as a trend filter that eliminates market noise.
We utilize the trending and mean reverting states as extra conditions required for valid trades for the two strategy types respectively, in the process increasing our trade entry quality.
🔷 Example model Architecture:
Here is an example of one configuration of this strategy, combining all aspects discussed in this post.
Future Updates
- Automation integration (next update)
AI SuperTrend x Pivot Percentile - Strategy [PresentTrading]
█ Introduction and How it is Different
The AI SuperTrend x Pivot Percentile strategy is a sophisticated trading approach that integrates AI-driven analysis with traditional technical indicators. Combining the AI SuperTrend with the Pivot Percentile strategy highlights several key advantages:
1. Enhanced Accuracy in Trend Prediction: The AI SuperTrend utilizes the K-Nearest Neighbors (KNN) algorithm for trend prediction, improving accuracy by considering historical data patterns. This is complemented by the Pivot Percentile analysis, which provides additional context on trend strength.
2. Comprehensive Market Analysis: The integration offers a multi-faceted approach to market analysis, combining AI insights with traditional technical indicators. This dual approach captures a broader range of market dynamics.
BTC 6H L/S Performance
█ Strategy: How it Works - Detailed Explanation
🔶 AI-Enhanced SuperTrend Indicators
1. SuperTrend Calculation:
- The SuperTrend indicator is calculated using a moving average and the Average True Range (ATR). The basic formula is:
- Upper Band = Moving Average + (Multiplier × ATR)
- Lower Band = Moving Average - (Multiplier × ATR)
- The moving average type (SMA, EMA, WMA, RMA, VWMA) and the length of the moving average and ATR are adjustable parameters.
- The direction of the trend is determined based on the position of the closing price in relation to these bands.
2. AI Integration with K-Nearest Neighbors (KNN):
- The KNN algorithm is applied to predict trend direction. It uses historical price data and SuperTrend values to classify the current trend as bullish or bearish.
- The algorithm calculates the 'distance' between the current data point and historical points. The 'k' nearest data points (neighbors) are identified based on this distance.
- A weighted average of these neighbors' trends (bullish or bearish) is calculated to predict the current trend.
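The strategy's exact KNN implementation isn't published, but a toy sketch of the idea - measuring distances between the current (price, SuperTrend) pair and the last n historical pairs, then letting the k nearest neighbors vote by their trend direction - might look like this. The distance metric and the equal (unweighted) votes are simplifying assumptions:
//@version=6
indicator("KNN trend vote - sketch")
k = input.int(3,  "Neighbors")
n = input.int(24, "Data Points")
[st, dir] = ta.supertrend(3.0, 10)
// distance of the current (close, supertrend) pair to each of the last n pairs
var float[] dists = array.new_float()
array.clear(dists)
for i = 1 to n
    array.push(dists, math.abs(close - close[i]) + math.abs(st - st[i]))
// the k nearest neighbours vote with their trend direction
// (in ta.supertrend, direction < 0 means an uptrend)
votes = 0.0
for j = 1 to k
    nearest = array.indexof(dists, array.min(dists))
    votes += dir[nearest + 1] < 0 ? 1.0 : -1.0   // dists index 0 maps to bar offset 1
    array.set(dists, nearest, 1.0e10)            // exclude this neighbour from the next pass
plot(votes, "KNN trend vote")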
For more please check: Multi-TF AI SuperTrend with ADX - Strategy
🔶 Pivot Percentile Analysis
1. Percentile Calculation:
- This involves calculating the percentile ranks for high and low prices over a set of predefined lengths.
- The percentile function is typically defined as:
- Percentile = Value at (P/100) × (N + 1)th position
- Where P is the desired percentile, and N is the number of data points.
2. Trend Strength Evaluation:
- The calculated percentiles for highs and lows are used to determine the strength of bullish and bearish trends.
- For instance, a high percentile rank in the high prices may indicate a strong bullish trend, and vice versa for bearish trends.
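As a sketch of this percentile logic using Pine's built-in percent rank (the strategy evaluates a set of predefined lengths; a single length is shown for brevity):
//@version=6
indicator("Pivot percentile - sketch")
len = input.int(10, "Length")
// percentile rank of the current bar's extremes within the last 'len' bars
bullStrength = ta.percentrank(high, len)        // high near the top of its recent range -> strong bulls
bearStrength = 100 - ta.percentrank(low, len)   // low near the bottom of its recent range -> strong bears
plot(bullStrength, "Bull strength", color.green)
plot(bearStrength, "Bear strength", color.red)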
For more please check: Pivot Percentile Trend - Strategy
🔶 Strategy Integration
1. Combining SuperTrend and Pivot Percentile:
- The strategy synthesizes the insights from both AI-enhanced SuperTrend and Pivot Percentile analysis.
- It compares the trend direction indicated by the SuperTrend with the strength of the trend as suggested by the Pivot Percentile analysis.
2. Signal Generation:
- A trading signal is generated when both the AI-enhanced SuperTrend and the Pivot Percentile analysis agree on the trend direction.
- For instance, a bullish signal is generated when both the SuperTrend is bullish, and the Pivot Percentile analysis shows strength in bullish trends.
🔶 Risk Management and Filters
- ADX and DMI Filter: The strategy uses the Average Directional Index (ADX) and the Directional Movement Index (DMI) as filters to assess the trend's strength and direction.
- Dynamic Trailing Stop Loss: Based on the SuperTrend indicator, the strategy dynamically adjusts stop-loss levels to manage risk effectively.
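A minimal sketch of a SuperTrend-based trailing stop using Pine's built-in ta.supertrend; the entry rule and parameters here are placeholders, not the strategy's own logic:
```
//@version=5
strategy("SuperTrend Trailing Stop (sketch)", overlay=true)
[st, dir] = ta.supertrend(3.0, 10)
// placeholder entry: go long when SuperTrend flips bullish
if ta.change(dir) < 0
    strategy.entry("L", strategy.long)
// the stop simply rides the SuperTrend line while long
if strategy.position_size > 0
    strategy.exit("XL", from_entry="L", stop=st)
plot(st, "SuperTrend", color = dir < 0 ? color.green : color.red)
```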
This strategy stands out for its ability to combine real-time AI analysis with established technical indicators, offering traders a nuanced and responsive tool for navigating complex market conditions. The equations and algorithms involved are pivotal in accurately identifying market trends and potential trade opportunities.
█ Usage
To effectively use this strategy, traders should:
1. Understand the AI and Pivot Percentile Indicators: A clear grasp of how these indicators work will enable traders to make informed decisions.
2. Interpret the Signals Accurately: The strategy provides bullish, bearish, and neutral signals. Traders should align these signals with their market analysis and trading goals.
3. Monitor Market Conditions: Given that this strategy is sensitive to market dynamics, continuous monitoring is crucial for timely decision-making.
4. Adjust Settings as Needed: Traders should feel free to tweak the input parameters to suit their trading preferences and to respond to changing market conditions.
█ Default Settings and Their Impact on Performance
1. Trading Direction (Default: "Both")
Effect: Determines whether the strategy will take long positions, short positions, or both. Adjusting this setting can align the strategy with the trader's market outlook or risk preference.
2. AI Settings (Neighbors: 3, Data Points: 24)
Neighbors: The number of nearest neighbors in the KNN algorithm. A higher number might smooth out noise but could miss subtle, recent changes. A lower number makes the model more sensitive to recent data but may increase noise.
Data Points: Defines the amount of historical data considered. More data points provide a broader context but may dilute recent trends' impact.
3. SuperTrend Settings (Length: 10, Factor: 3.0, MA Source: "WMA")
Length: Affects the sensitivity of the SuperTrend indicator. A longer length results in a smoother, less sensitive indicator, ideal for long-term trends.
Factor: Determines the bandwidth of the SuperTrend. A higher factor creates wider bands, capturing larger price movements but potentially missing short-term signals.
MA Source: The type of moving average used (e.g., WMA - Weighted Moving Average). Different MA types can affect the trend indicator's responsiveness and smoothness.
4. AI Trend Prediction Settings (Price Trend: 10, Prediction Trend: 80)
Price Trend and Prediction Trend Lengths: These settings define the lengths of weighted moving averages for price and SuperTrend, impacting the responsiveness and smoothness of the AI's trend predictions.
5. Pivot Percentile Settings (Length: 10)
Length: Influences the calculation of pivot percentiles. A shorter length makes the percentile more responsive to recent price changes, while a longer length offers a broader view of price trends.
6. ADX and DMI Settings (ADX Length: 14, Time Frame: 'D')
ADX Length: Defines the period for the Average Directional Index calculation. A longer period results in a smoother ADX line.
Time Frame: Sets the time frame for the ADX and DMI calculations, affecting the sensitivity to market changes.
7. Commission, Slippage, and Initial Capital
These settings relate to transaction costs and initial investment, directly impacting net profitability and strategy feasibility.
Holiday
Library "Holiday"
- Full Control over Holidays and Daylight Savings Time (DLS)
The Holiday Library is an essential tool for traders and analysts who engage in backtesting and live trading. This comprehensive library enables the incorporation of crucial calendar elements - specifically Daylight Savings Time (DLS) adjustments and public holidays - into trading strategies and backtesting environments.
Key Features:
- DLS Adjustments: The library takes into account the shifts in time due to Daylight Savings. This feature is particularly vital for backtesting strategies, as DLS can impact trading hours, which in turn affects the volatility and liquidity in the market. Accurate DLS adjustments ensure that backtesting scenarios are as close to real-life conditions as possible.
- Comprehensive Holiday Metadata: The library includes a rich set of holiday metadata, allowing for the detailed scheduling of trading activities around public holidays. This feature is crucial for avoiding skewed results in backtesting, where holiday trading sessions might differ significantly in terms of volume and price movement.
- Customizable Holiday Schedules: Users can add or remove specific holidays, tailoring the library to fit various regional market schedules or specific trading requirements.
- Visualization Aids: The library supports on-chart labels, making it visually intuitive to identify holidays and DLS shifts directly on trading charts.
Use Cases:
1. Strategy Development: When developing trading strategies, it’s important to account for non-trading days and altered trading hours due to holidays and DLS. This library enables a realistic and accurate representation of these factors.
2. Risk Management: Trading around holidays can be riskier due to thinner liquidity and greater volatility. By integrating holiday data, traders can better manage their risk exposure.
3. Backtesting Accuracy: For backtesting to be effective, it must simulate the actual market conditions as closely as possible. Incorporating holidays and DLS adjustments contributes to more reliable and realistic backtesting results.
4. Global Trading: For traders active in multiple global markets, this library provides an easy way to handle different holiday schedules and DLS shifts across regions.
The Holiday Library is a versatile tool that enhances the precision and realism of trading simulations and strategy development. Its integration into the trading workflow is straightforward and beneficial for both novice and experienced traders.
EasterAlgo(_year)
Calculates the date of Easter Sunday for a given year using the Anonymous Gregorian algorithm.
A closely related `Gauss Algorithm for Easter Sunday` was developed by the mathematician Carl Friedrich Gauss.
This algorithm is based on the cycles of the moon and the fact that Easter always falls on the first Sunday after the first ecclesiastical full moon that occurs on or after March 21.
While it's not considered to be 100% accurate due to rare exceptions, it does give the correct date in most cases.
It's important to note that Gauss's formula has been found to be inaccurate for some 21st-century years in the Gregorian calendar. Specifically, the next suggested failure years are 2038, 2051.
This function can be used for Good Friday (Friday before Easter), Easter Sunday, and Easter Monday (following Monday).
en.wikipedia.org
Parameters:
_year (int) : `int` - The year for which to calculate the date of Easter Sunday. This should be a four-digit year (YYYY).
Returns: tuple - The month (1-12) and day (1-31) of Easter Sunday for the given year.
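For reference, here is a sketch of the anonymous Gregorian (Meeus/Jones/Butcher) computation that this function describes; the library's internal implementation may differ in detail, and the function name below is hypothetical:
```
//@version=5
indicator("Easter Date (sketch)", overlay=true)
// anonymous Gregorian algorithm; all divisions are integer divisions
f_easter(int y) =>
    a = y % 19
    b = y / 100
    c = y % 100
    d = b / 4
    e = b % 4
    f = (b + 8) / 25
    g = (b - f + 1) / 3
    h = (19 * a + b - d - g + 15) % 30
    i = c / 4
    k = c % 4
    l = (32 + 2 * e + 2 * i - h - k) % 7
    m = (a + 11 * h + 22 * l) / 451
    mo = (h + l - 7 * m + 114) / 31
    dy = (h + l - 7 * m + 114) % 31 + 1
    [mo, dy]
[emo, edy] = f_easter(year)
plotchar(month == emo and dayofmonth == edy, "Easter", "★", location.abovebar)
```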
easterInit()
Inits the date of Easter Sunday and Good Friday for a given year.
Returns: tuple - The month (1-12) and day (1-31) of Easter Sunday and Good Friday for the given year.
isLeapYear(_year)
Determine if a year is a leap year.
Parameters:
_year (int) : `int` - 4 digit year to check => YYYY
Returns: `bool` - true if input year is a leap year
method timezoneHelper(utc)
Helper function to convert UTC time.
Namespace types: series int, simple int, input int, const int
Parameters:
utc (int) : `int` - UTC time shift in hours.
Returns: `string`- UTC time string with shift applied.
weekofmonth()
Function to find the week of the month of a given Unix Time.
Returns: number - The week of the month of the specified UTC time.
dayLightSavingsAdjustedUTC(utc, adjustForDLS)
dayLightSavingsAdjustedUTC
Parameters:
utc (int) : `int` - The normal UTC timestamp to be used for reference.
adjustForDLS (bool) : `bool` - Flag indicating whether to adjust for daylight savings time (DLS).
Returns: `int` - The adjusted UTC timestamp for the given normal UTC timestamp.
getDayOfYear(monthOfYear, dayOfMonth, weekOfMonth, dayOfWeek, lastOccurrenceInMonth, holiday)
Function gets the day of the year of a given holiday (1-366)
Parameters:
monthOfYear (int)
dayOfMonth (int)
weekOfMonth (int)
dayOfWeek (int)
lastOccurrenceInMonth (bool)
holiday (string)
Returns: `int` - The day of the year of the holiday 1-366.
method buildMap(holidayMap, holiday, monthOfYear, weekOfMonth, dayOfWeek, dayOfMonth, lastOccurrenceInMonth, closingTime)
Function to build the `holidaysMap`.
Namespace types: map
Parameters:
holidayMap (map) : `map` - The map of holidays.
holiday (string) : `string` - The name of the holiday.
monthOfYear (int) : `int` - The month of the year of the holiday.
weekOfMonth (int) : `int` - The week of the month of the holiday.
dayOfWeek (int) : `int` - The day of the week of the holiday.
dayOfMonth (int) : `int` - The day of the month of the holiday.
lastOccurrenceInMonth (bool) : `bool` - Flag indicating whether the holiday is the last occurrence of the day in the month.
closingTime (int) : `int` - The closing time of the holiday.
Returns: `map` - The updated map of holidays
holidayInit(addHolidaysArray, removeHolidaysArray, defaultHolidays)
Initializes a HolidayStorage object with predefined US holidays.
Parameters:
addHolidaysArray (array) : `array` - The array of additional holidays to be added.
removeHolidaysArray (array) : `array` - The array of holidays to be removed.
defaultHolidays (bool) : `bool` - Flag indicating whether to include the default holidays.
Returns: `map` - The map of holidays.
Holidays(utc, addHolidaysArray, removeHolidaysArray, adjustForDLS, displayLabel, defaultHolidays)
Main function to build the holidays object, this is the only function from this library that should be needed. \
all functionality should be available through this function. \
With the exception of initializing a `HolidayMetaData` object to add a holiday or early close. \
\
**Default Holidays:** \
`DLS begin`, `DLS end`, `New Year's Day`, `MLK Jr. Day`, \
`Washington Day`, `Memorial Day`, `Independence Day`, `Labor Day`, \
`Columbus Day`, `Veterans Day`, `Thanksgiving Day`, `Christmas Day` \
\
**Example**
```
HolidayMetaData valentinesDay = HolidayMetaData.new(holiday="Valentine's Day", monthOfYear=2, dayOfMonth=14)
HolidayMetaData stPatricksDay = HolidayMetaData.new(holiday="St. Patrick's Day", monthOfYear=3, dayOfMonth=17)
array<HolidayMetaData> addHolidaysArray = array.from(valentinesDay, stPatricksDay)
array<string> removeHolidaysArray = array.from("DLS begin", "DLS end")
Holidays = Holidays(
     utc=-6,
     addHolidaysArray=addHolidaysArray,
     removeHolidaysArray=removeHolidaysArray,
     adjustForDLS=true,
     displayLabel=true,
     defaultHolidays=true)
plot(Holidays.newHoliday ? open : na, title="newHoliday", color=color.red, linewidth=4, style=plot.style_circles)
```
Parameters:
utc (int) : `int` - The UTC time shift in hours
addHolidaysArray (array) : `array` - The array of additional holidays to be added
removeHolidaysArray (array) : `array` - The array of holidays to be removed
adjustForDLS (bool) : `bool` - Flag indicating whether to adjust for daylight savings time (DLS)
displayLabel (bool) : `bool` - Flag indicating whether to display a label on the chart
defaultHolidays (bool) : `bool` - Flag indicating whether to include the default holidays
Returns: `HolidayObject` - The holidays object | Holidays = (holidaysMap: map, newHoliday: bool, holiday: string, dayString: string)
HolidayMetaData
HolidayMetaData
Fields:
holiday (series string) : `string` - The name of the holiday.
dayOfYear (series int) : `int` - The day of the year of the holiday.
monthOfYear (series int) : `int` - The month of the year of the holiday.
dayOfMonth (series int) : `int` - The day of the month of the holiday.
weekOfMonth (series int) : `int` - The week of the month of the holiday.
dayOfWeek (series int) : `int` - The day of the week of the holiday.
lastOccurrenceInMonth (series bool)
closingTime (series int) : `int` - The closing time of the holiday.
utc (series int) : `int` - The UTC time shift in hours.
HolidayObject
HolidayObject
Fields:
holidaysMap (map) : `map` - The map of holidays.
newHoliday (series bool) : `bool` - Flag indicating whether today is a new holiday.
activeHoliday (series bool) : `bool` - Flag indicating whether today is an active holiday.
holiday (series string) : `string` - The name of the holiday.
dayString (series string) : `string` - The day of the week of the holiday.
Volume Forecast
The Volume Forecast indicator on TradingView is a comprehensive tool designed to analyze historical price action and project future market movements based on the average sizes of candles. Incorporating various data points such as candle high/low, open/close, and real volumes, Volume Forecast provides traders with a holistic view of market dynamics, allowing for more informed decision-making.
Key Features:
Multi-Data Source Analysis:
Volume Forecast seamlessly integrates multiple data sources, including candle high/low, open/close prices, and real volumes. By considering these diverse elements, the indicator offers a nuanced understanding of market conditions.
Historical Candle Size Analysis:
The indicator conducts a thorough analysis of historical candle sizes, capturing key data points to calculate the average candle size over a specified period. This historical context serves as the foundation for forecasting future candle sizes.
Customizable Forecasting Parameters:
Traders have the flexibility to fine-tune forecasting parameters to align with their trading strategies. Whether focusing on open/close relationships, high/low points, or real volumes, users can customize the indicator to suit their preferences.
Predictive Algorithm:
Volume Forecast employs a sophisticated predictive algorithm that leverages historical candle size data to project the potential size of upcoming candles. This algorithmic approach enhances the indicator's accuracy in forecasting market movements.
Visual Clarity:
The indicator provides a clear visual representation on the TradingView chart, displaying historical candle sizes and forecasted values. Color-coded elements and visual cues help traders quickly interpret the data, facilitating timely decision-making.
Adaptive Real-Time Updates:
Volume Forecast dynamically updates in real-time, ensuring traders have access to the latest information. This adaptability allows for swift adjustments to trading strategies in response to changing market conditions.
Comprehensive Market Compatibility:
Whether trading stocks, forex, cryptocurrencies, or commodities, Volume Forecast is compatible across various financial instruments and timeframes. This versatility makes it a valuable asset for traders in different markets.
User-Friendly Interface:
With an intuitive interface, Volume Forecast is accessible to traders of all experience levels. The indicator's user-friendly design streamlines the analysis process, making it easier for traders to incorporate it into their trading routines.
In summary, Volume Forecast is a robust TradingView indicator that combines historical candle size analysis with advanced forecasting techniques. By incorporating multiple data sources and offering customization options, it empowers traders to make more informed decisions in anticipation of market movements. Whether used independently or in conjunction with other tools, Volume Forecast is a valuable asset for traders seeking a comprehensive understanding of market dynamics.
Grid by Volatility (Expo)
█ Overview
The Grid by Volatility is designed to provide a dynamic grid overlay on your price chart. This grid is calculated based on the volatility and adjusts in real-time as market conditions change. The indicator uses Standard Deviation to determine volatility and is useful for traders looking to understand price volatility patterns, determine potential support and resistance levels, or validate other trading signals.
█ How It Works
The indicator initiates its computations by assessing the market volatility through an established statistical model: the Standard Deviation. Following the volatility determination, the algorithm calculates a central equilibrium line—commonly referred to as the "mid-line"—on the chart to serve as a baseline for additional computations. Subsequently, upper and lower grid lines are algorithmically generated and plotted equidistantly from the central mid-line, with the distance being dictated by the previously calculated volatility metrics.
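A minimal sketch of that construction, assuming default-style inputs; the published indicator's internals may differ:
```
//@version=5
indicator("Volatility Grid (sketch)", overlay=true)
len   = input.int(200, "Volatility Length")
mult  = input.float(6.0, "Squeeze Adjustment")
smLen = input.int(2, "Grid Confirmation Length")
// band width from the Standard Deviation, scaled by the squeeze multiplier
width = ta.stdev(close, len) * mult
// smoothed equilibrium ("mid") line
mid = ta.wma(ta.sma(close, len), smLen)
plot(mid, "Mid", color.gray)
plot(mid + width, "Upper", color.teal)
plot(mid - width, "Lower", color.maroon)
```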
█ How to Use
Trend Analysis: The grid can be used to analyze the underlying trend of the asset. For example, if the price is above the Average Line and moves toward the Upper Range, it indicates a strong bullish trend.
Support and Resistance: The grid lines can act as dynamic support and resistance levels. Price tends to bounce off these levels or break through, providing potential trade opportunities.
Volatility Gauge: The distance between the grid lines serves as a measure of market volatility. Wider lines indicate higher volatility, while narrower lines suggest low volatility.
█ Settings
Volatility Length: Number of bars to calculate the Standard Deviation (Default: 200)
Squeeze Adjustment: Multiplier for the Standard Deviation (Default: 6)
Grid Confirmation Length: Number of bars to calculate the weighted moving average for smoothing the grid lines (Default: 2)
-----------------
Disclaimer
The information contained in my Scripts/Indicators/Ideas/Algos/Systems does not constitute financial advice or a solicitation to buy or sell any securities of any type. I will not accept liability for any loss or damage, including without limitation any loss of profit, which may arise directly or indirectly from the use of or reliance on such information.
All investments involve risk, and the past performance of a security, industry, sector, market, financial product, trading strategy, backtest, or individual's trading does not guarantee future results or returns. Investors are fully responsible for any investment decisions they make. Such decisions should be based solely on an evaluation of their financial circumstances, investment objectives, risk tolerance, and liquidity needs.
My Scripts/Indicators/Ideas/Algos/Systems are only for educational purposes!
Machine Learning: kNN (New Approach)
Description:
kNN is a very robust and simple method for data classification and prediction. It is very effective if the training data is large. However, its main parameter, K (the number of nearest neighbors), is difficult to determine beforehand. The computation cost is also quite high because we need to compute the distance from each instance to all training samples. Nevertheless, in algorithmic trading kNN is reported to perform on a par with such techniques as SVM and Random Forest. It is also widely used in the area of data science.
The input data is just a long series of prices over time without any particular features. The value to be predicted is just the next bar's price. For nearest-neighbor techniques, as for some other types of prediction algorithms, the problem is solved by creating training records from, for instance, 10 consecutive prices, using the first 9 as predictor values and the 10th as the prediction value. Built this way, 100 data points in your time series yield 10 non-overlapping training records. You can create even more training records by starting a new record at every data point: take the first 10 data points as one record, then the 10 consecutive points starting at the second data point, the 10 starting at the third, and so on.
By default, only 10 initial data points are shown as predictor values, with the 6th as the prediction value.
Here is a step-by-step walkthrough of how to compute the K nearest neighbors (KNN) algorithm for quantitative data:
1. Determine parameter K = number of nearest neighbors.
2. Calculate the distance between the instance and all the training samples. As we are dealing with one-dimensional distance, we simply take the absolute difference between the instance and each training value (| x – v |).
3. Rank the distance and determine nearest neighbors based on the K'th minimum distance.
4. Gather the values of the nearest neighbors.
5. Use average of nearest neighbors as the prediction value of the instance.
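A compact Pine sketch of steps 1-5 for one-dimensional price data; parameter names are illustrative and this is not the published script's exact code:
```
//@version=5
indicator("1-D kNN Next-Bar Estimate (sketch)", overlay=true, max_bars_back = 500)
k = input.int(5, "K")
n = input.int(100, "Training window")
dists = array.new_float()
nexts = array.new_float()
// step 2: one-dimensional distance |x - v| to every training sample
for i = 1 to n
    array.push(dists, math.abs(close - close[i]))
    array.push(nexts, close[i - 1])  // the bar that followed sample i
// steps 3-5: pick the k smallest distances and average their next-bar values
float pred = 0.0
for j = 1 to k
    idx = array.indexof(dists, array.min(dists))
    pred += array.get(nexts, idx)
    array.set(dists, idx, 999999999.0)  // exclude chosen neighbor
plot(pred / k, "kNN estimate", color=color.orange)
```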
The original logic of the algorithm was slightly modified, and as a result, at approximately N=17 the resulting curve nicely approximates that of the sma(20). See the description below. Besides the sma-like MA, this algorithm also gives you a hint on the direction of the next bar's move.
LinearRegressionLibrary
Library "LinearRegressionLibrary" contains functions for fitting a regression line to the time series by means of different models, as well as functions for estimating the accuracy of the fit.
Linear regression algorithms:
RepeatedMedian(y, n, lastBar) applies repeated median regression (robust linear regression algorithm) to the input time series within the selected interval.
Parameters:
y :: float series, source time series (e.g. close)
n :: integer, the length of the selected time interval
lastBar :: integer, index of the last bar of the selected time interval (defines the position of the interval)
Output:
mSlope :: float, slope of the regression line
mInter :: float, intercept of the regression line
TheilSen(y, n, lastBar) applies the Theil-Sen estimator (robust linear regression algorithm) to the input time series within the selected interval.
Parameters:
y :: float series, source time series
n :: integer, the length of the selected time interval
lastBar :: integer, index of the last bar of the selected time interval (defines the position of the interval)
Output:
tsSlope :: float, slope of the regression line
tsInter :: float, intercept of the regression line
OrdinaryLeastSquares(y, n, lastBar) applies the ordinary least squares regression (non-robust) to the input time series within the selected interval.
Parameters:
y :: float series, source time series
n :: integer, the length of the selected time interval
lastBar :: integer, index of the last bar of the selected time interval (defines the position of the interval)
Output:
olsSlope :: float, slope of the regression line
olsInter :: float, intercept of the regression line
Model performance metrics:
metricRMSE(y, n, lastBar, slope, intercept) returns the Root-Mean-Square Error (RMSE) of the regression. The better the model, the lower the RMSE.
Parameters:
y :: float series, source time series (e.g. close)
n :: integer, the length of the selected time interval
lastBar :: integer, index of the last bar of the selected time interval (defines the position of the interval)
slope :: float, slope of the evaluated linear regression line
intercept :: float, intercept of the evaluated linear regression line
Output:
rmse :: float, RMSE value
metricMAE(y, n, lastBar, slope, intercept) returns the Mean Absolute Error (MAE) of the regression. MAE is is similar to RMSE but is less sensitive to outliers. The better the model, the lower the MAE.
Parameters:
y :: float series, source time series
n :: integer, the length of the selected time interval
lastBar :: integer, index of the last bar of the selected time interval (defines the position of the interval)
slope :: float, slope of the evaluated linear regression line
intercept :: float, intercept of the evaluated linear regression line
Output:
mae :: float, MAE value
metricR2(y, n, lastBar, slope, intercept) returns the coefficient of determination (R squared) of the regression. The better the linear regression fits the data (compared to the sample mean), the closer the value of the R squared is to 1.
Parameters:
y :: float series, source time series
n :: integer, the length of the selected time interval
lastBar :: integer, index of the last bar of the selected time interval (defines the position of the interval)
slope :: float, slope of the evaluated linear regression line
intercept :: float, intercept of the evaluated linear regression line
Output:
Rsq :: float, R-sqared score
Usage example:
```
//@version=5
indicator('ExampleLinReg', overlay=true)
// import the library
import tbiktag/LinearRegressionLibrary/1 as linreg
// define the studied interval: last 100 bars
int Npoints = 100
int lastBar = bar_index
int firstBar = bar_index - Npoints
// apply repeated median regression to the closing price time series within the specified interval
[slope, intercept] = linreg.RepeatedMedian(close, Npoints, lastBar)
// calculate the root-mean-square error of the obtained linear fit
rmse = linreg.metricRMSE(close, Npoints, lastBar, slope, intercept)
// plot the line and print the RMSE value
float y1 = intercept
float y2 = intercept + slope * (Npoints - 1)
if barstate.islast
    line.new(firstBar, y1, lastBar, y2)
    label.new(lastBar, y2, text='RMSE = ' + str.format("{0,number,#.#}", rmse))
```
Pulu's 3 Moving Averages
Pulu's 3 Moving Averages
Release version 1, date 2021-09-28
This script allows you to customize three sets of moving averages: turn them on/off and set their colors and parameters. It also tags the start date of the last moving average set, if there is one. This release (version 1) supports eight moving average algorithms:
ALMA, Arnaud Legoux Moving Average
EMA, Exponential Moving Average
RMA, Adjusted exponential moving average (aka Wilder’s EMA)
SMA, Simple Moving Average
SWMA, Symmetrically-Weighted Moving Average
VWAP, Volume-Weighted Average Price
VWMA, Volume-Weighted Moving Average
WMA, Weighted Moving Average
The availability and function parameters:
Func.                      Availability     Parameters
ALMA                       MA1, MA2, MA3    source, length, offset, sigma
EMA, RMA, SMA, VWMA, WMA   MA1, MA2, MA3    source, length
SWMA, VWAP                 MA1              source
Parameters
source: the series of values to process. The default is to use the closing price to calculate the moving average.
length: an integer value that defines the number of bars to calculate the moving average on. The SWMA and VWAP do not use this parameter.
ALMA offset: a floating-point value that controls the tradeoff between smoothness (with a value closer to 1) and responsiveness (with a value closer to 0). This parameter is only used by ALMA.
ALMA sigma: a floating-point value that specifies the ALMA's smoothness. The larger this value, the smoother the moving average is. This parameter is only used by ALMA.
I'm not sure whether it is needed, so I did not give the script's three moving averages individual algorithm settings. That would involve much more complicated condition testing and use up more of the TradingView script line limit. If you need to combine different algorithms across the three sets of moving averages, or have other ideas, leave a message to let me know; maybe I will try it in the next update.
Cycles Strategy
This back-testable strategy is a modified version of the Stochastic strategy. It is meant to accompany the modified Stochastic indicator: "Cycles".
Modifications to the Stochastic strategy include;
1. Programmable settings for the Stochastic Periods (%K, %D and Smooth %K).
2. Programmable settings for the MACD Periods (Fast, Slow, Smoothing)
3. Programmable thresholds for %K, to qualify a potential entry strategy.
4. Programmable thresholds for %D, to qualify a potential exit strategy.
5. Buttons to choose which components to use in the trading algorithm.
6. Choose the month and year to back test.
The trading algorithm:
1. When %K exceeds the upper/lower threshold and then hooks down/up in the direction of the Moving Average (MA), the minimum entry qualification is met.
2. When %D exceeds the lower/upper threshold and is angled in the direction of the trade, the exit qualification is met.
3. Additional entry filters include the direction of MACD, Signal and %D, as well as the "cliff": a long entry requires a higher high, a short entry a lower low.
4. Strategy can only go "Long" or "Short" depending on the selected setting.
5. By matching the settings in the "Cycles" indicator, you can (almost) see what the strategy is doing.
6. Be sure to select the "Recalculate" buttons, to recalculate on every new Tick, for best results.
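A bare-bones sketch of the hook logic in points 1-2, with hypothetical thresholds and a placeholder MA; the published strategy adds the MACD, %D and "cliff" filters described above:
```
//@version=5
strategy("Stochastic Hook Entry (sketch)")
kLen  = input.int(14, "%K")
smth  = input.int(3, "Smooth %K")
dLen  = input.int(3, "%D")
upper = input.float(80.0, "Upper threshold")
lower = input.float(20.0, "Lower threshold")
k  = ta.sma(ta.stoch(close, high, low, kLen), smth)
d  = ta.sma(k, dLen)
ma = ta.sma(close, 50)
// minimum entry: %K beyond a threshold, then hooking in the MA's direction
longHook  = k[1] < lower and k > k[1] and close > ma
shortHook = k[1] > upper and k < k[1] and close < ma
if longHook
    strategy.entry("L", strategy.long)
if shortHook
    strategy.entry("S", strategy.short)
// exit: %D beyond the opposite threshold and angling against the trade
if strategy.position_size > 0 and d > upper and d < d[1]
    strategy.close("L")
if strategy.position_size < 0 and d < lower and d > d[1]
    strategy.close("S")
```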
Please click the Like button and leave a comment if you appreciate this script. Improvements will be implemented as time goes on.
I am not a licensed trade advisor. This strategy is for entertainment only. Use at your own risk!
[blackcat] L3 Improved Dual Ehlers BPF for Volatility Detection
OVERVIEW
This script implements an advanced L3 Improved Dual Ehlers Bandpass Filter (BPF) for volatility detection, combining both L1 and L2 calculation methods to create a comprehensive trading signal. The script leverages John Ehlers' sophisticated digital signal processing techniques to identify market cycles and extract meaningful trading signals from price action. By combining multiple cycle detection methods and filtering approaches, it provides traders with a powerful tool for identifying trend changes, momentum shifts, and potential reversal points across various market conditions and timeframes. The L3 approach uniquely combines the outputs of both L1 (0~1 range) and L2 (-1~1 range) methods, creating a signal that ranges from -1~2 and provides enhanced sensitivity to market dynamics.
FEATURES
🔄 Dual Calculation Methods: Choose between L1 (0~1 range), L2 (-1~1 range), or combine both for the L3 signal (-1~2 range) to match your trading style
📊 Multiple Cycle Detection: Seven different dominant cycle calculation methods including HoDyDC (Hilbert Transform Dominant Cycle), PhAcDC (Phase Accumulation Dominant Cycle), DuDiDC (Duane Dominant Cycle), CycPer (Cycle Period), BPZC (Bandpass Zero Crossing), AutoPer (Autocorrelation Period), and DFTDC (Discrete Fourier Transform Dominant Cycle)
🎛️ Flexible Mixing Options: Six sophisticated mixing methods including weighted averaging, simple sum, difference extraction, dominant-only, subdominant-only, and adaptive mixing that adjusts based on signal strength
🌊 Bandpass Filtering: Precise bandwidth control for both dominant and subdominant filters, allowing fine-tuning of frequency response characteristics
📈 Advanced Divergence Detection: Robust algorithm for identifying bullish and bearish divergences with customizable lookback periods and range constraints
🎨 Comprehensive Visualization: Extensive customization options for all signals, colors, plot styles, and display elements
🔔 Comprehensive Alert System: Built-in alerts for divergence signals, zero line crosses, and various market conditions
📊 Real-time Cycle Information: Optional display of dominant and subdominant cycle periods for educational purposes
🔄 Adaptive Signal Processing: Dynamic adjustment of parameters based on market conditions and volatility
🎯 Multiple Signal Outputs: Simultaneous generation of L1, L2, and L3 signals for different trading strategies
HOW TO USE
Select Calculation Method: Choose between "l1" (0~1 range), "l2" (-1~1 range), or "both" (L3, -1~2 range) in the Calculation Method settings based on your preferred signal characteristics
Configure Cycle Detection: Select your preferred Dominant Cycle Method from the seven available options and adjust the Cycle Part parameter (0.1-0.9) to fine-tune cycle sensitivity
Set Subdominant Parameters: Configure the subdominant cycle either as a ratio of the dominant cycle or as a fixed period, depending on your analysis approach
Adjust Filter Bandwidth: Fine-tune the bandwidth settings for both dominant and subdominant filters (0.1-1.0) to control the frequency response and signal smoothing
Choose Mixing Method: Select how to combine the filters - weighted averaging for balance, sum for maximum sensitivity, difference for trend isolation, or adaptive mixing for dynamic response
Configure Smoothing: Select from SMA, EMA, or HMA smoothing methods with adjustable length (1-20 bars) to reduce noise in the final signal
Customize Visualization: Enable/disable individual plots, divergence detection, zero line, fill areas, and customize all colors to match your chart preferences
Set Divergence Parameters: Configure lookback ranges (5-60 bars) for divergence detection to match your trading timeframe and style
Monitor Signals: Watch for crosses above/below zero line and divergence patterns, paying attention to signal strength and consistency
Set Up Alerts: Configure alerts for divergence signals, zero line crosses, and other market conditions to stay informed of trading opportunities
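For orientation, here is the textbook Ehlers bandpass recursion that underlies filters like these; it is a generic sketch of the technique, not this script's dual-filter implementation:
```
//@version=5
indicator("Ehlers Bandpass (textbook sketch)")
period = input.int(20, "Cycle period")
bw     = input.float(0.3, "Bandwidth", minval=0.1, maxval=1.0)
// standard Ehlers bandpass coefficients
beta_  = math.cos(2 * math.pi / period)
gamma_ = 1.0 / math.cos(2 * math.pi * bw / period)
alpha_ = gamma_ - math.sqrt(gamma_ * gamma_ - 1)
var float bp = 0.0
bp := 0.5 * (1 - alpha_) * (close - close[2]) + beta_ * (1 + alpha_) * nz(bp[1]) - alpha_ * nz(bp[2])
plot(bp, "Bandpass")
hline(0)
```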
LIMITATIONS
The script requires the dc_ta library from blackcat1402 for several advanced cycle calculation methods (HoDyDC, PhAcDC, DuDiDC, CycPer, BPZC, AutoPer, DFTDC)
The L1 method operates in the 0~1 range while the L2 method uses the -1~1 range, requiring different interpretation approaches
Combined L3 signal ranges from -1~2 when both methods are selected, creating unique signal characteristics that traders must adapt to
Divergence detection accuracy depends on proper lookback period settings and market volatility conditions
Performance may be impacted with very long lookback ranges (>60 bars) or when multiple plots are simultaneously enabled
The script is designed for non-overlay use and may not display correctly on certain chart types or with conflicting indicators
Adaptive mixing method requires careful threshold tuning to avoid excessive signal fluctuation
Cycle detection algorithms may produce unreliable results during low volatility or highly choppy market conditions
The script assumes regular price data and may not perform optimally with irregular or gapped price sequences
NOTES
The script implements advanced mathematical calculations including bandpass filters, Hilbert transforms, and various cycle detection algorithms developed by John Ehlers
For optimal results, experiment with different cycle detection methods and bandwidth settings across various market conditions and timeframes
The adaptive mixing method automatically adjusts weights based on signal strength, providing dynamic response to changing market conditions
Divergence detection works best when the "Plot Divergence" option is enabled and when combined with other technical analysis tools
Zero line crosses can indicate potential trend changes or momentum shifts, especially when confirmed by volume or other indicators
The script includes commented code for cycle information display that can be enabled if you want to monitor cycle periods in real-time
Different calculation methods may perform better in different market environments - L1 tends to be smoother while L2 is more sensitive
The subdominant cycle helps filter out noise and provides additional confirmation for signals generated by the dominant cycle
Bandwidth settings control the filter's frequency response - lower values provide more smoothing while higher values increase sensitivity
Mixing methods offer different approaches to combining signals - weighted averaging is generally most reliable for most trading applications
THANKS
Special thanks to John Ehlers for his pioneering work in cycle analysis and digital signal processing for financial markets. This script implements and significantly improves upon his bandpass filter methodology, incorporating multiple advanced techniques from his extensive body of work. Also heartfelt thanks to blackcat1402 for the dc_ta library that provides essential cycle calculation methods and for maintaining such a valuable resource for the Pine Script community. Additional appreciation to the TradingView platform for providing the tools and environment that make sophisticated technical analysis accessible to traders worldwide. This script represents a collaborative effort in advancing the field of algorithmic trading and technical analysis.
Langlands-Operadic Möbius Vortex (LOMV)
Langlands-Operadic Möbius Vortex (LOMV)
Where Pure Mathematics Meets Market Reality
A Revolutionary Synthesis of Number Theory, Category Theory, and Market Dynamics
🎓 THEORETICAL FOUNDATION
The Langlands-Operadic Möbius Vortex represents a groundbreaking fusion of three profound mathematical frameworks that have never before been combined for market analysis:
The Langlands Program: Harmonic Analysis in Markets
Developed by Robert Langlands (Fields Medal recipient), the Langlands Program creates bridges between number theory, algebraic geometry, and harmonic analysis. In our indicator:
L-Function Implementation:
- Utilizes the Möbius function μ(n) for weighted price analysis
- Applies Riemann zeta function convergence principles
- Calculates quantum harmonic resonance between -2 and +2
- Measures deep mathematical patterns invisible to traditional analysis
The L-Function core calculation employs:
L_sum = Σ(return_val × μ(n) × n^(-s))
Where s is the critical strip parameter (0.5-2.5), controlling mathematical precision and signal smoothness.
Operadic Composition Theory: Multi-Strategy Democracy
Category theory and operads provide the mathematical framework for composing multiple trading strategies into a unified signal. This isn't simple averaging - it's mathematical composition using:
Strategy Composition Arity (2-5 strategies):
- Momentum analysis via RSI transformation
- Mean reversion through Bollinger Band mathematics
- Order Flow Polarity Index (revolutionary T3-smoothed volume analysis)
- Trend detection using Directional Movement
- Higher timeframe momentum confirmation
Agreement Threshold System: Democratic voting where strategies must reach consensus before signal generation. This prevents false signals during market uncertainty.
Möbius Function: Number Theory in Action
The Möbius function μ(n) forms the mathematical backbone:
- μ(n) = 1 if n is a square-free positive integer with an even number of prime factors
- μ(n) = -1 if n is a square-free positive integer with an odd number of prime factors
- μ(n) = 0 if n has a squared prime factor
This creates oscillating weights that reveal hidden market periodicities and harmonic structures.
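A small sketch of μ(n) by trial division, plugged into the posted L_sum formula; the helper names and the simple-return definition are assumptions for illustration, not the indicator's actual code:
```
//@version=5
indicator("Mobius L-sum (sketch)", max_bars_back = 500)
sParam = input.float(1.5, "Critical strip s", minval=0.5, maxval=2.5)
nTerms = input.int(30, "Modular level N")
// Mobius mu(n): 0 if a squared prime divides n,
// otherwise (-1)^(number of distinct prime factors)
f_mobius(int n) =>
    int m = n
    int factors = 0
    bool squared = false
    int p = 2
    while p * p <= m and not squared
        if m % p == 0
            m := m / p
            factors += 1
            if m % p == 0
                squared := true
        else
            p += 1
    if m > 1
        factors += 1
    squared ? 0 : factors % 2 == 0 ? 1 : -1
// the posted formula: L_sum = sum(return_val * mu(n) * n^(-s))
float lSum = 0.0
for n = 1 to nTerms
    r = nz(close[n - 1] / close[n] - 1)
    lSum += r * f_mobius(n) * math.pow(n, -sParam)
plot(lSum, "L-sum")
```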
🔧 COMPREHENSIVE INPUT SYSTEM
Langlands Program Parameters
Modular Level N (5-50, default 30):
Primary lookback for quantum harmonic analysis. Optimized by timeframe:
- Scalping (1-5min): 15-25
- Day Trading (15min-1H): 25-35
- Swing Trading (4H-1D): 35-50
- Asset-specific: Crypto 15-25, Stocks 30-40, Forex 35-45
L-Function Critical Strip (0.5-2.5, default 1.5):
Controls Riemann zeta convergence precision:
- Higher values: More stable, smoother signals
- Lower values: More reactive, catches quick moves
- High frequency: 0.8-1.2, Medium: 1.3-1.7, Low: 1.8-2.3
Frobenius Trace Period (5-50, default 21):
Galois representation lookback for price-volume correlation:
- Measures harmonic relationships in market flows
- Scalping: 8-15, Day Trading: 18-25, Swing: 25-40
HTF Multi-Scale Analysis:
Higher timeframe context prevents trading against major trends:
- Provides market bias and filters signals
- Improves win rates by 15-25% through trend alignment
Operadic Composition Parameters
Strategy Composition Arity (2-5, default 4):
Number of algorithms composed for final signal:
- Conservative: 4-5 strategies (higher confidence)
- Moderate: 3-4 strategies (balanced approach)
- Aggressive: 2-3 strategies (more frequent signals)
Category Agreement Threshold (2-5, default 3):
Democratic voting minimum for signal generation:
- Higher agreement: Fewer but higher quality signals
- Lower agreement: More signals, potential false positives
Swiss-Cheese Mixing (0.1-0.5, default 0.382):
Golden ratio φ⁻¹ based blending of trend factors:
- 0.382 is φ⁻¹, optimal for natural market fractals
- Higher values: Stronger trend following
- Lower values: More contrarian signals
OFPI Configuration:
- OFPI Length (5-30, default 14): Order Flow calculation period
- T3 Smoothing (3-10, default 5): Advanced exponential smoothing
- T3 Volume Factor (0.5-1.0, default 0.7): Smoothing aggressiveness control
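Since OFPI is not a standard indicator, here is one plausible reading of "T3-smoothed volume polarity", using the standard Tillson T3; the actual OFPI formula in the script may differ, and all names below are illustrative:
```
//@version=5
indicator("OFPI (sketch)")
oflen = input.int(14, "OFPI Length")
t3len = input.int(5, "T3 Smoothing")
vf    = input.float(0.7, "T3 Volume Factor")
// standard Tillson T3: six cascaded EMAs with volume-factor coefficients
f_t3(float src, simple int len, simple float a) =>
    e1 = ta.ema(src, len)
    e2 = ta.ema(e1, len)
    e3 = ta.ema(e2, len)
    e4 = ta.ema(e3, len)
    e5 = ta.ema(e4, len)
    e6 = ta.ema(e5, len)
    c1 = -a * a * a
    c2 = 3 * a * a + 3 * a * a * a
    c3 = -6 * a * a - 3 * a - 3 * a * a * a
    c4 = 1 + 3 * a + a * a * a + 3 * a * a
    c1 * e6 + c2 * e5 + c3 * e4 + c4 * e3
// candle-body-weighted volume polarity in [-1, 1], then T3-smoothed
polarity = high != low ? (close - open) / (high - low) : 0.0
flow     = math.sum(polarity * volume, oflen) / math.sum(volume, oflen)
ofpi     = f_t3(flow, t3len, vf)
plot(ofpi, "OFPI", color = ofpi > 0 ? color.green : color.red)
hline(0)
```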
Unified Scoring System
Component Weights (sum ≈ 1.0):
- L-Function Weight (0.1-0.5, default 0.3): Mathematical harmony emphasis
- Galois Rank Weight (0.1-0.5, default 0.2): Market structure complexity
- Operadic Weight (0.1-0.5, default 0.3): Multi-strategy consensus
- Correspondence Weight (0.1-0.5, default 0.2): Theory-practice alignment
Signal Threshold (0.5-10.0, default 5.0):
Quality filter producing:
- 8.0+: EXCEPTIONAL signals only
- 6.0-7.9: STRONG signals
- 4.0-5.9: MODERATE signals
- 2.0-3.9: WEAK signals
🎨 ADVANCED VISUAL SYSTEM
Multi-Dimensional Quantum Aura Bands
Five-layer resonance field showing market energy:
- Colors: Theme-matched gradients (Quantum purple, Holographic cyan, etc.)
- Expansion: Dynamic based on score intensity and volatility
- Function: Multi-timeframe support/resistance zones
Morphism Flow Portals
Category theory visualization showing market topology:
- Green/Cyan Portals: Bullish mathematical flow
- Red/Orange Portals: Bearish mathematical flow
- Size/Intensity: Proportional to signal strength
- Recursion Depth (1-8): Nested patterns for flow evolution
Fractal Grid System
Dynamic support/resistance with projected L-Scores:
- Multiple Timeframes: 10, 20, 30, 40, 50-period highs/lows
- Smart Spacing: Prevents level overlap using ATR-based minimum distance
- Projections: Estimated signal scores when price reaches levels
- Usage: Precise entry/exit timing with mathematical confirmation
Wick Pressure Analysis
Rejection level prediction using candle mathematics:
- Upper Wicks: Selling pressure zones (purple/red lines)
- Lower Wicks: Buying pressure zones (purple/green lines)
- Glow Intensity (1-8): Visual emphasis and line reach
- Application: Confluence with fractal grid creates high-probability zones
Regime Intensity Heatmap
Background coloring showing market energy:
- Black/Dark: Low activity, range-bound markets
- Purple Glow: Building momentum and trend development
- Bright Purple: High activity, strong directional moves
- Calculation: Combines trend, momentum, volatility, and score intensity
Six Professional Themes
- Quantum: Purple/violet for general trading and mathematical focus
- Holographic: Cyan/magenta optimized for cryptocurrency markets
- Crystalline: Blue/turquoise for conservative, stability-focused trading
- Plasma: Gold/magenta for high-energy volatility trading
- Cosmic Neon: Bright neon colors for maximum visibility and aggressive trading
📊 INSTITUTIONAL-GRADE DASHBOARD
Unified AI Score Section
- Total Score (-10 to +10): Primary decision metric
- >5: Strong bullish signals
- <-5: Strong bearish signals
- Quality ratings: EXCEPTIONAL > STRONG > MODERATE > WEAK
- Component Analysis: Individual L-Function, Galois, Operadic, and Correspondence contributions
Order Flow Analysis
Revolutionary OFPI integration:
- OFPI Value (-100% to +100%): Real buying vs selling pressure
- Visual Gauge: Horizontal bar chart showing flow intensity
- Momentum Status: SHIFTING, ACCELERATING, STRONG, MODERATE, or WEAK
- Trading Application: Flow shifts often precede major moves
Signal Performance Tracking
- Win Rate Monitoring: Real-time success percentage with emoji indicators
- Signal Count: Total signals generated for frequency analysis
- Current Position: LONG, SHORT, or NONE with P&L tracking
- Volatility Regime: HIGH, MEDIUM, or LOW classification
Market Structure Analysis
- Möbius Field Strength: Mathematical field oscillation intensity
- CHAOTIC: High complexity, use wider stops
- STRONG: Active field, normal position sizing
- MODERATE: Balanced conditions
- WEAK: Low activity, consider smaller positions
- HTF Trend: Higher timeframe bias (BULL/BEAR/NEUTRAL)
- Strategy Agreement: Multi-algorithm consensus level
Position Management
When in trades, displays:
- Entry Price: Original signal price
- Current P&L: Real-time percentage with risk level assessment
- Duration: Bars in trade for timing analysis
- Risk Level: HIGH/MEDIUM/LOW based on current exposure
🚀 SIGNAL GENERATION LOGIC
Balanced Long/Short Architecture
The indicator generates signals through multiple convergent pathways:
Long Entry Conditions:
- Score threshold breach with algorithmic agreement
- Strong bullish order flow (OFPI > 0.15) with positive composite signal
- Bullish pattern recognition with mathematical confirmation
- HTF trend alignment with momentum shifting
- Extreme bullish OFPI (>0.3) with any positive score
Short Entry Conditions:
- Score threshold breach with bearish agreement
- Strong bearish order flow (OFPI < -0.15) with negative composite signal
- Bearish pattern recognition with mathematical confirmation
- HTF trend alignment with momentum shifting
- Extreme bearish OFPI (<-0.3) with any negative score
Exit Logic:
- Score deterioration below continuation threshold
- Signal quality degradation
- Opposing order flow acceleration
- 10-bar minimum between signals prevents overtrading
⚙️ OPTIMIZATION GUIDELINES
Asset-Specific Settings
Cryptocurrency Trading:
- Modular Level: 15-25 (capture volatility)
- L-Function Precision: 0.8-1.3 (reactive to price swings)
- OFPI Length: 10-20 (fast correlation shifts)
- Cascade Levels: 5-7, Theme: Holographic
Stock Index Trading:
- Modular Level: 25-35 (balanced trending)
- L-Function Precision: 1.5-1.8 (stable patterns)
- OFPI Length: 14-20 (standard correlation)
- Cascade Levels: 4-5, Theme: Quantum
Forex Trading:
- Modular Level: 35-45 (smooth trends)
- L-Function Precision: 1.6-2.1 (high smoothing)
- OFPI Length: 18-25 (disable volume amplification)
- Cascade Levels: 3-4, Theme: Crystalline
Timeframe Optimization
Scalping (1-5 minute charts):
- Reduce all lookback parameters by 30-40%
- Increase L-Function precision for noise reduction
- Enable all visual elements for maximum information
- Use Small dashboard to save screen space
Day Trading (15 minute - 1 hour):
- Use default parameters as starting point
- Adjust based on market volatility
- Normal dashboard provides optimal information density
- Focus on OFPI momentum shifts for entries
Swing Trading (4 hour - Daily):
- Increase lookback parameters by 30-50%
- Higher L-Function precision for stability
- Large dashboard for comprehensive analysis
- Emphasize HTF trend alignment
🏆 ADVANCED TRADING STRATEGIES
The Mathematical Confluence Method
1. Wait for Fractal Grid level approach
2. Confirm with projected L-Score > threshold
3. Verify OFPI alignment with direction
4. Enter on portal signal with quality ≥ STRONG
5. Exit on score deterioration or opposing flow
The Regime Trading System
1. Monitor Aether Flow background intensity
2. Trade aggressively during bright purple periods
3. Reduce position size during dark periods
4. Use Möbius Field strength for stop placement
5. Align with HTF trend for maximum probability
The OFPI Momentum Strategy
1. Watch for momentum shifting detection
2. Confirm with accelerating flow in direction
3. Enter on immediate portal signal
4. Scale out at Fibonacci levels
5. Exit on flow deceleration or reversal
⚠️ RISK MANAGEMENT INTEGRATION
Mathematical Position Sizing
- Use Galois Rank for volatility-adjusted sizing
- Möbius Field strength determines stop width
- Fractal Dimension guides maximum exposure
- OFPI momentum affects entry timing
Signal Quality Filtering
- Trade only STRONG or EXCEPTIONAL quality signals
- Increase position size with higher agreement levels
- Reduce risk during CHAOTIC Möbius field periods
- Respect HTF trend alignment for directional bias
🔬 DEVELOPMENT JOURNEY
Creating the LOMV was an extraordinary mathematical undertaking that pushed the boundaries of what's possible in technical analysis. This indicator almost didn't happen. The theoretical complexity nearly proved insurmountable.
The Mathematical Challenge
Implementing the Langlands Program required deep research into:
- Number theory and the Möbius function
- Riemann zeta function convergence properties
- L-function analytical continuation
- Galois representations in finite fields
The mathematical literature spans decades of pure mathematics research, requiring translation from abstract theory to practical market application.
The Computational Complexity
Operadic composition theory demanded:
- Category theory implementation in Pine Script
- Multi-dimensional array management for strategy composition
- Real-time democratic voting algorithms
- Performance optimization for complex calculations
The Integration Breakthrough
Bringing together three disparate mathematical frameworks required:
- Novel approaches to signal weighting and combination
- Revolutionary Order Flow Polarity Index development
- Advanced T3 smoothing implementation
- Balanced signal generation preventing directional bias
Months of intensive research culminated in breakthrough moments when the mathematics finally aligned with market reality. The result is an indicator that reveals market structure invisible to conventional analysis while maintaining practical trading utility.
🎯 PRACTICAL IMPLEMENTATION
Getting Started
1. Apply indicator with default settings
2. Select appropriate theme for your markets
3. Observe dashboard metrics during different market conditions
4. Practice signal identification without trading
5. Gradually adjust parameters based on observations
Signal Confirmation Process
- Never trade on score alone - verify quality rating
- Confirm OFPI alignment with intended direction
- Check fractal grid level proximity for timing
- Ensure Möbius field strength supports position size
- Validate against HTF trend for bias confirmation
Performance Monitoring
- Track win rate in dashboard for strategy assessment
- Monitor component contributions for optimization
- Adjust threshold based on desired signal frequency
- Document performance across different market regimes
🌟 UNIQUE INNOVATIONS
1. First Integration of Langlands Program mathematics with practical trading
2. Revolutionary OFPI with T3 smoothing and momentum detection
3. Operadic Composition using category theory for signal democracy
4. Dynamic Fractal Grid with projected L-Score calculations
5. Multi-Dimensional Visualization through morphism flow portals
6. Regime-Adaptive Background showing market energy intensity
7. Balanced Signal Generation preventing directional bias
8. Professional Dashboard with institutional-grade metrics
📚 EDUCATIONAL VALUE
The LOMV serves as both a practical trading tool and an educational gateway to advanced mathematics. Traders gain exposure to:
- Pure mathematics applications in markets
- Category theory and operadic composition
- Number theory through Möbius function implementation
- Harmonic analysis via L-function calculations
- Advanced signal processing through T3 smoothing
⚖️ RESPONSIBLE USAGE
This indicator represents advanced mathematical research applied to market analysis. While the underlying mathematics are rigorously implemented, markets remain inherently unpredictable.
Key Principles:
- Use as part of comprehensive trading strategy
- Implement proper risk management at all times
- Backtest thoroughly before live implementation
- Understand that past performance does not guarantee future results
- Never risk more than you can afford to lose
The mathematics reveal deep market structure, but successful trading requires discipline, patience, and sound risk management beyond any indicator.
🔮 CONCLUSION
The Langlands-Operadic Möbius Vortex represents a quantum leap forward in technical analysis, bringing PhD-level pure mathematics to practical trading while maintaining visual elegance and usability.
From the harmonic analysis of the Langlands Program to the democratic composition of operadic theory, from the number-theoretic precision of the Möbius function to the revolutionary Order Flow Polarity Index, every component works in mathematical harmony to reveal the hidden order within market chaos.
This is more than an indicator - it's a mathematical lens that transforms how you see and understand market structure.
Trade with mathematical precision. Trade with the LOMV.
*"Mathematics is the language with which God has written the universe." - Galileo Galilei*
*In markets, as in nature, profound mathematical beauty underlies apparent chaos. The LOMV reveals this hidden order.*
— Dskyz, Trade with insight. Trade with anticipation.
MLExtensions_Core
Library "MLExtensions_Core"
A set of extension methods for a novel implementation of an Approximate Nearest Neighbors (ANN) algorithm in Lorentzian space, focused on computation.
normalizeDeriv(src, quadraticMeanLength)
Returns the normalized derivative of the input series.
Parameters:
src (float) : The input series (i.e., the first-order derivative for price).
quadraticMeanLength (int) : The length of the quadratic mean (RMS).
Returns: nDeriv The normalized derivative of the input series.
normalize(src, min, max)
Rescales a source value with an unbounded range to a target range.
Parameters:
src (float) : The input series
min (float) : The minimum value of the unbounded range
max (float) : The maximum value of the unbounded range
Returns: The normalized series
rescale(src, oldMin, oldMax, newMin, newMax)
Rescales a source value with a bounded range to anther bounded range
Parameters:
src (float) : The input series
oldMin (float) : The minimum value of the range to rescale from
oldMax (float) : The maximum value of the range to rescale from
newMin (float) : The minimum value of the range to rescale to
newMax (float) : The maximum value of the range to rescale to
Returns: The rescaled series
getColorShades(color)
Creates an array of colors with varying shades of the input color
Parameters:
color (color) : The color to create shades of
Returns: An array of colors with varying shades of the input color
getPredictionColor(prediction, neighborsCount, shadesArr)
Determines the color shade based on prediction percentile
Parameters:
prediction (float) : Value of the prediction
neighborsCount (int) : The number of neighbors used in a nearest neighbors classification
shadesArr (array) : An array of colors with varying shades of the input color
Returns: shade Color shade based on prediction percentile
color_green(prediction)
Assigns varying shades of the color green based on the KNN classification
Parameters:
prediction (float) : Value (int|float) of the prediction
Returns: color
color_red(prediction)
Assigns varying shades of the color red based on the KNN classification
Parameters:
prediction (float) : Value of the prediction
Returns: color
tanh(src)
Returns the hyperbolic tangent of the input series. The sigmoid-like hyperbolic tangent function is used to compress the input to a value between -1 and 1.
Parameters:
src (float) : The input series (i.e., the normalized derivative).
Returns: tanh The hyperbolic tangent of the input series.
dualPoleFilter(src, lookback)
Returns the smoothed hyperbolic tangent of the input series.
Parameters:
src (float) : The input series (i.e., the hyperbolic tangent).
lookback (int) : The lookback window for the smoothing.
Returns: filter The smoothed hyperbolic tangent of the input series.
tanhTransform(src, smoothingFrequency, quadraticMeanLength)
Returns the tanh transform of the input series.
Parameters:
src (float) : The input series (i.e., the result of the tanh calculation).
smoothingFrequency (int)
quadraticMeanLength (int)
Returns: signal The smoothed hyperbolic tangent transform of the input series.
n_rsi(src, n1, n2)
Returns the normalized RSI ideal for use in ML algorithms.
Parameters:
src (float) : The input series (i.e., the result of the RSI calculation).
n1 (simple int) : The length of the RSI.
n2 (simple int) : The smoothing length of the RSI.
Returns: signal The normalized RSI.
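One plausible body consistent with this signature (smooth the RSI, then rescale [0, 100] down to [0, 1]); the library's actual implementation may differ:
```
//@version=5
indicator("n_rsi sketch")
// hypothetical stand-in for the library's n_rsi
f_n_rsi(float src, simple int n1, simple int n2) =>
    ta.ema(ta.rsi(src, n1), n2) / 100.0
plot(f_n_rsi(close, 14, 1), "Normalized RSI")
```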
n_cci(src, n1, n2)
Returns the normalized CCI ideal for use in ML algorithms.
Parameters:
src (float) : The input series (i.e., the result of the CCI calculation).
n1 (simple int) : The length of the CCI.
n2 (simple int) : The smoothing length of the CCI.
Returns: signal The normalized CCI.
n_wt(src, n1, n2)
Returns the normalized WaveTrend Classic series ideal for use in ML algorithms.
Parameters:
src (float) : The input series (i.e., the result of the WaveTrend Classic calculation).
n1 (simple int)
n2 (simple int)
Returns: signal The normalized WaveTrend Classic series.
n_adx(highSrc, lowSrc, closeSrc, n1)
Returns the normalized ADX ideal for use in ML algorithms.
Parameters:
highSrc (float) : The input series for the high price.
lowSrc (float) : The input series for the low price.
closeSrc (float) : The input series for the close price.
n1 (simple int) : The length of the ADX.
regime_filter(src, threshold, useRegimeFilter)
Parameters:
src (float)
threshold (float)
useRegimeFilter (bool)
filter_adx(src, length, adxThreshold, useAdxFilter)
filter_adx
Parameters:
src (float) : The source series.
length (simple int) : The length of the ADX.
adxThreshold (int) : The ADX threshold.
useAdxFilter (bool) : Whether to use the ADX filter.
Returns: The ADX.
filter_volatility(minLength, maxLength, sensitivityMultiplier, useVolatilityFilter)
filter_volatility
Parameters:
minLength (simple int) : The minimum length of the ATR.
maxLength (simple int) : The maximum length of the ATR.
sensitivityMultiplier (float) : Multiplier for the historical ATR to control sensitivity.
useVolatilityFilter (bool) : Whether to use the volatility filter.
Returns: Boolean indicating whether or not to let the signal pass through the filter.
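One way such a filter could work, sketched under the assumption that it compares a recent ATR against a longer historical ATR scaled by the sensitivity multiplier:

//@version=5
indicator("Volatility filter sketch")
// assumed logic: let the signal through only when recent volatility
// exceeds the scaled historical baseline
filter_volatility(simple int minLength, simple int maxLength, float mult, bool use) =>
    recentAtr = ta.atr(minLength)
    historicalAtr = ta.atr(maxLength)
    use ? recentAtr > historicalAtr * mult : true
bgcolor(filter_volatility(1, 10, 1.0, true) ? color.new(color.green, 85) : na)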
Volume Predictor [PhenLabs]📊 Volume Predictor
Version: PineScript™ v6
📌 Description
The Volume Predictor is an advanced technical indicator that leverages machine learning and statistical modeling techniques to forecast future trading volume. This innovative tool analyzes historical volume patterns to predict volume levels for upcoming bars, providing traders with valuable insights into potential market activity. By combining multiple prediction algorithms with pattern recognition techniques, the indicator delivers forward-looking volume projections that can enhance trading strategies and market analysis.
🚀 Points of Innovation:
Machine learning pattern recognition using Lorentzian distance metrics
Multi-algorithm prediction framework with algorithm selection
Ensemble learning approach combining multiple prediction methods
Real-time accuracy metrics with visual performance dashboard
Dynamic volume normalization for consistent scale representation
Forward-looking visualization with configurable prediction horizon
🔧 Core Components
Pattern Recognition Engine : Identifies similar historical volume patterns using Lorentzian distance metrics
Multi-Algorithm Framework : Offers five distinct prediction methods with configurable parameters
Volume Normalization : Converts raw volume to percentage scale for consistent analysis
Accuracy Tracking : Continuously evaluates prediction performance against actual outcomes
Advanced Visualization : Displays actual vs. predicted volume with configurable future bar projections
Interactive Dashboard : Shows real-time performance metrics and prediction accuracy
🔥 Key Features
The indicator provides comprehensive volume analysis through:
Multiple Prediction Methods : Choose from Lorentzian, KNN Pattern, Ensemble, EMA, or Linear Regression algorithms
Pattern Matching : Identifies similar historical volume patterns to project future volume
Adaptive Predictions : Generates volume forecasts for multiple bars into the future
Performance Tracking : Calculates and displays real-time prediction accuracy metrics
Normalized Scale : Presents volume as a percentage of historical maximums for consistent analysis
Customizable Visualization : Configure how predictions and actual volumes are displayed
Interactive Dashboard : View algorithm performance metrics in a customizable information panel
🎨 Visualization
Actual Volume Columns : Color-coded green/red bars showing current normalized volume
Prediction Columns : Semi-transparent blue columns representing predicted volume levels
Future Bar Projections : Forward-looking volume predictions with configurable transparency
Prediction Dots : Optional white dots highlighting future prediction points
Reference Lines : Visual guides showing the normalized volume scale
Performance Dashboard : Customizable panel displaying prediction method and accuracy metrics
📖 Usage Guidelines
History Lookback Period
Default: 20
Range: 5-100
This setting determines how many historical bars are analyzed for pattern matching. A longer period provides more historical data for pattern recognition but may reduce responsiveness to recent changes. A shorter period emphasizes recent market behavior but might miss longer-term patterns.
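As a rough Pine Script sketch of the normalization this lookback feeds into (an assumed form, for illustration; the indicator's internals may differ):

//@version=5
indicator("Normalized volume sketch")
lookback = input.int(20, "History Lookback Period", minval=5, maxval=100)
// volume expressed as a percentage of its highest value inside the lookback window
normVol = 100 * volume / ta.highest(volume, lookback)
plot(normVol, style=plot.style_columns)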
🧠 Prediction Method
Algorithm
Default: Lorentzian
Options: Lorentzian, KNN Pattern, Ensemble, EMA, Linear Regression
Selects the algorithm used for volume prediction:
Lorentzian: Uses Lorentzian distance metrics for pattern recognition, offering excellent noise resistance
KNN Pattern: Traditional K-Nearest Neighbors approach for historical pattern matching
Ensemble: Combines multiple methods with weighted averaging for robust predictions
EMA: Simple exponential moving average projection for trend-following predictions
Linear Regression: Projects future values based on linear trend analysis
Pattern Length
Default: 5
Range: 3-10
Defines the number of bars in each pattern for machine learning methods. Shorter patterns increase sensitivity to recent changes, while longer patterns may identify more complex structures but require more historical data.
Neighbors Count
Default: 3
Range: 1-5
Sets the K value (number of nearest neighbors) used in KNN and Lorentzian methods. Higher values produce smoother predictions by averaging more historical patterns, while lower values may capture more specific patterns but could be more susceptible to noise.
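To make the distance metric concrete, here is a hedged sketch of a Lorentzian pattern distance in Pine Script (the log-dampened form commonly used for this metric; the script's actual implementation may differ):

//@version=5
indicator("Lorentzian distance sketch")
// Lorentzian distance: log(1 + |a_i - b_i|) summed over the pattern;
// the log dampens outlier bars compared to a Euclidean distance
lorentzian(array<float> a, array<float> b) =>
    float d = 0.0
    for i = 0 to a.size() - 1
        d += math.log(1 + math.abs(a.get(i) - b.get(i)))
    d
// demo: distance between the last 5 volumes and the 5 bars before them
a = array.from(volume, volume[1], volume[2], volume[3], volume[4])
b = array.from(volume[5], volume[6], volume[7], volume[8], volume[9])
plot(lorentzian(a, b))

In a KNN setup, the K patterns with the smallest such distances would then be averaged, often weighted by inverse distance, to form the forecast.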
Prediction Horizon
Default: 5
Range: 1-10
Determines how many future bars to predict. Longer horizons provide more forward-looking information but typically decrease accuracy as the prediction window extends.
📊 Display Settings
Display Mode
Default: Overlay
Options: Overlay, Prediction Only
Controls how volume information is displayed:
Overlay: Shows both actual volume and predictions on the same chart
Prediction Only: Displays only the predictions without actual volume
Show Prediction Dots
Default: false
When enabled, adds white dots to future predictions for improved visibility and clarity.
Future Bar Transparency (%)
Default: 70
Range: 0-90
Controls the transparency of future prediction bars. Higher values make future bars more transparent, while lower values make them more visible.
📱 Dashboard Settings
Show Dashboard
Default: true
Toggles display of the prediction accuracy dashboard. When enabled, shows real-time accuracy metrics.
Dashboard Location
Default: Bottom Right
Options: Top Left, Top Right, Bottom Left, Bottom Right
Determines where the dashboard appears on the chart.
Dashboard Text Size
Default: Normal
Options: Small, Normal, Large
Controls the size of text in the dashboard for various display sizes.
Dashboard Style
Default: Solid
Options: Solid, Transparent
Sets the visual style of the dashboard background.
Understanding Accuracy Metrics
The dashboard provides key performance metrics to evaluate prediction quality:
Average Error
Shows the average difference between predicted and actual values
Positive values indicate the prediction tends to be higher than actual volume
Negative values indicate the prediction tends to be lower than actual volume
Values closer to zero indicate better prediction accuracy
Accuracy Percentage
A measure of how close predictions are to actual outcomes
Higher percentages (>70%) indicate excellent prediction quality
Moderate percentages (50-70%) indicate acceptable predictions
Lower percentages (<50%) suggest weaker prediction reliability
The accuracy metrics are color-coded for quick assessment:
Green: Strong prediction performance
Orange: Moderate prediction performance
Red: Weaker prediction performance
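A hedged sketch of how metrics like these can be computed; the forecast below is a hypothetical stand-in (a simple EMA projection), not the indicator's own prediction logic:

//@version=5
indicator("Accuracy tracking sketch")
// hypothetical stand-in forecast: the EMA used as a one-bar-ahead prediction
predicted = ta.ema(volume, 5)
err = predicted[1] - volume  // previous bar's forecast vs. realized volume
avgErr = ta.sma(err, 20)     // positive: over-forecasting; negative: under-forecasting
// simple accuracy: 100% for a perfect call, 0% when the miss is as large as the volume itself
accuracy = 100 * math.max(0, 1 - math.abs(err) / math.max(volume, 1))
plot(avgErr, "Average Error")
plot(accuracy, "Accuracy %")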
✅ Best Use Cases
Anticipate upcoming volume spikes or drops
Identify potential volume divergences from price action
Plan entries and exits around expected volume changes
Filter trading signals based on predicted volume support
Optimize position sizing by forecasting market participation
Prepare for potential volatility changes signaled by volume predictions
Enhance technical pattern analysis with volume projection context
⚠️ Limitations
Volume predictions become less accurate over longer time horizons
Performance varies based on market conditions and asset characteristics
Works best on liquid assets with consistent volume patterns
Requires sufficient historical data for pattern recognition
Sudden market events can disrupt prediction accuracy
Volume spikes may be muted in predictions due to normalization
💡 What Makes This Unique
Machine Learning Approach : Applies Lorentzian distance metrics for robust pattern matching
Algorithm Selection : Offers multiple prediction methods to suit different market conditions
Real-time Accuracy Tracking : Provides continuous feedback on prediction performance
Forward Projection : Visualizes multiple future bars with configurable display options
Normalized Scale : Presents volume as a percentage of maximum volume for consistent analysis
Interactive Dashboard : Displays key metrics with customizable appearance and placement
🔬 How It Works
The Volume Predictor processes market data through five main steps:
1. Volume Normalization:
Converts raw volume to percentage of maximum volume in lookback period
Creates consistent scale representation across different timeframes and assets
Stores historical normalized volumes for pattern analysis
2. Pattern Detection:
Identifies similar volume patterns in historical data
Uses Lorentzian distance metrics for robust similarity measurement
Determines strength of pattern match for prediction weighting
3. Algorithm Processing:
Applies selected prediction algorithm to historical patterns
For KNN/Lorentzian: Finds K nearest neighbors and calculates weighted prediction
For Ensemble: Combines multiple methods with optimized weighting
For EMA/Linear Regression: Projects trends based on statistical models
4. Accuracy Calculation:
Compares previous predictions to actual outcomes
Calculates average error and prediction accuracy
Updates performance metrics in real-time
5. Visualization:
Displays normalized actual volume with color-coding
Shows current and future volume predictions
Presents performance metrics through interactive dashboard
💡 Note:
The Volume Predictor performs optimally on liquid assets with established volume patterns. It’s most effective when used in conjunction with price action analysis and other technical indicators. The multi-algorithm approach allows adaptation to different market conditions by switching prediction methods. Pay special attention to the accuracy metrics when evaluating prediction reliability, as sudden market changes can temporarily reduce prediction quality. The normalized percentage scale makes the indicator consistent across different assets and timeframes, providing a standardized approach to volume analysis.
Accurate Bollinger Bands mcbw_ [True Volatility Distribution]The Bollinger Bands have become a very important technical tool for discretionary and algorithmic traders alike over the last decades. They were designed to give traders an edge on the markets by assigning probabilistic values to different levels of volatility. However, some of the assumptions that go into their calculations make them unusable for traders who want a correct understanding of the volatility the bands are meant to measure. Let's go through what the Bollinger Bands are said to show, how their calculations work, the problems in those calculations, and how the indicator I am presenting today fixes them.
--> If you just want to know how the settings work then skip straight to the end, or click on the little (i) symbol next to the values in the indicator settings window when it's on your chart <--
--------------------------- What Are Bollinger Bands ---------------------------
The Bollinger Bands were created in the 1980s, a time when many retail traders interacted with their symbols via physically printed charts and personal computer memory was measured in KB (about a factor of a million smaller than today). Bollinger Bands are designed to help a trader or algorithm see the likelihood of price expanding outside of its typical range: the further the lines are from the current price, the less often they should get hit. With this hands-on understanding, many strategies use these levels to designate breakout trades or to assist in defining price ranges.
--------------------------- How Bollinger Bands Work ---------------------------
The calculations that go into Bollinger Bands are rather simple. There is a moving average that centers the indicator, and an equidistant top band and bottom band are drawn at a fixed width away. The moving average is just a typical moving average (or common variant) that tracks the price action, while the distance to the top and bottom bands is a direct function of recent price volatility. The way that the distance to the bands is calculated is inspired by formulas from statistics. The standard deviation is taken from the candles that go into the moving average and then multiplied by a user-defined value to set the bands' position; I will call this value 'the multiple'. When discussing Bollinger Bands, the trading community at large normally treats 'the multiple' as a multiplier of the standard deviation as it applies to a normal distribution (Gaussian probability). On a normal distribution, the number of standard deviations away (which traders use directly as 'the multiple') corresponds directly to how likely/unlikely something is to happen:
1 standard deviation equals 68.3%, meaning that the price should stay inside the 1-standard-deviation band 68.3% of the time and be outside of it 31.7% of the time;
2 standard deviations equal 95.5%, meaning that the price should stay inside the 2-standard-deviation band 95.5% of the time and be outside of it 4.5% of the time;
3 standard deviations equal 99.7%, meaning that the price should stay inside the 3-standard-deviation band 99.7% of the time and be outside of it 0.3% of the time.
Therefore, when traders set 'the multiple' to 2, they interpret this as meaning that price should stay inside the bands 95.5% of the time.
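For reference, the classic construction looks roughly like this in Pine Script (a minimal sketch of the standard formula described above, not this indicator's code):

//@version=5
indicator("Classic Bollinger Bands sketch", overlay=true)
length = input.int(20, "Length")
mult = input.float(2.0, "The multiple")
basis = ta.sma(close, length)            // the centering moving average
dev = mult * ta.stdev(close, length)     // band distance = multiple * standard deviation
plot(basis)
plot(basis + dev, "Upper")
plot(basis - dev, "Lower")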
---------------- The Problem With The Math of Bollinger Bands ----------------
In and of themselves the Bollinger Bands are a great tool, but they have been loaded with an incorrect sense of statistical meaning, when they should really just be taken at face value without any further interpretation or implication.
In order to explain this it is going to get a bit technical so I will give a little math background and try to simplify things. First let's review some statistics topics (distributions, percentiles, standard deviations) and then with that understanding explore the incorrect logic of how Bollinger Bands have been interpreted/employed.
---------------- Quick Stats Review ----------------
.
(If you are comfortable with statistics feel free to skip ahead to the next section)
.
-------- I: Probability distributions --------
When you have a lot of data it is helpful to see how many times different results appear in your dataset. To visualize this people use "histograms", which just show how many times each element appears in the dataset by stacking each of the same elements on top of each other to form a graph. You may be familiar with the bell curve (also called the "normal distribution", the name we will use from here on). The normal distribution histogram looks like a big hump around zero that drops off very quickly the further you get from it. This shape (the bell curve) is very nice because it has a lot of nifty mathematical properties and seems to show up in nature all the time. Since it pops up in so many places, society has developed many shortcuts related to it that speed up all kinds of calculations, including the shortcut that 1 standard deviation = 68.3%, 2 standard deviations = 95.5%, and 3 standard deviations = 99.7% (these only apply to the normal distribution). Despite how handy the normal distribution is, how many shortcuts we have for it, and how much it shows up in the natural world, there is nothing that forces your specific dataset to look like it. In fact, your data can have any possible shape. As we will explore later, economic and financial datasets *rarely* follow the normal distribution.
-------- II: Percentiles --------
After you have made the histogram of your dataset you have built the "probability distribution" of your own dataset, specific to all the data you have collected. There is a whole complicated framework for how to accurately calculate percentiles, but we will dramatically simplify it for our use. The 'percentile' in our case is just how many data points we are away from the "middle" of the dataset (normally just 0). Let's say I took the difference of the daily close of a symbol for the last two weeks; green candles would be positive and red would be negative. In this example my dataset of day-by-day closing price differences is:
week 1:
week 2:
Sorting all of these values into a single dataset, I have:
I can separate the positive and negative returns and explore their distributions separately:
negative return distribution =
positive return distribution =
Taking the 25th percentile of these would just be taking the value that sits 25% of the way toward the end of these sorted returns. Likewise, the 100th percentile would just be the value at the very end of them:
negative return distribution (50%) = -5
positive return distribution (50%) = +4
negative return distribution (100%) = -10
positive return distribution (100%) = +20
Or, instead of separating the positive and negative returns, we can also look at all of the differences in the daily close as just pure price movement and not account for the direction. In this case we would pool all of the data together by ignoring the negative signs of the negative returns:
combined return distribution =
In this case the 50th and 100th percentiles of the combined return distribution would be:
combined return distribution (50%) = 4
combined return distribution (100%) = 10
Sometimes taking the positive and negative distributions separately is better than pooling them into a combined distribution for some purposes. Other times the combined distribution is better.
Most financial data has very different distributions for negative returns and positive returns. This is encapsulated in sayings like "Price takes the stairs up and the elevator down".
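To make the percentile idea concrete in code, here is a tiny Pine Script sketch using a hypothetical dataset (the example's own values were omitted above, so these numbers are purely illustrative):

//@version=5
indicator("Percentile sketch")
// hypothetical daily return dataset, purely for illustration
var rets = array.from(-10.0, -5.0, -2.0, 1.0, 4.0, 20.0)
plot(array.percentile_nearest_rank(rets, 50), "50th percentile")
plot(array.percentile_nearest_rank(rets, 100), "100th percentile")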
-------- III: Standard Deviation --------
The formula for the standard deviation (referred to here by its shorthand 'STDEV') can be intimidating, but going through each of its elements will illuminate what it does. The formula for STDEV is equal to:
square root( sum( (x - mean) ^ 2 ) / N )
Going back to the dataset that you might have, the variables in the formula above are:
'mean' is the average of your entire dataset
'x' is just representative of a single point in your dataset (one point at a time)
'N' is the total number of things in your dataset.
Going back to the STDEV formula above, we can see how each part of it works, starting with the '(x - mean)' part. What this does is take every single point of the dataset and measure how far away it is from the mean of the entire dataset. Taking this value to the power of two, '(x - mean) ^ 2', means that points that are very far away from the dataset mean get 'penalized' much more heavily. Points that are very close to the dataset mean are not impacted as much. In practice, this means that if your dataset had a bunch of values that were spread over a wide range but always stayed in that range, this value ('(x - mean) ^ 2') would end up being small. On the other hand, if your dataset was full of the exact same number but had a couple of outliers very far away, this would have a much larger value, since the square in '(x - mean) ^ 2' makes them grow massive. Now including the sum part, 'sum( (x - mean) ^ 2 )', this just adds up all of the squared distances from the dataset mean. Then this is divided by the number of values in the dataset ('N'), and finally the square root of that value is taken.
There is nothing inherently special or definitive about the STDEV formula; it is just a tool with extremely widespread use and adoption. As we saw here, all the STDEV formula is really doing is measuring the intensity of the outliers.
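Written out in Pine Script, the formula above looks like this (a from-scratch sketch, shown next to the built-in for comparison):

//@version=5
indicator("STDEV from scratch")
len = 20
m = ta.sma(close, len)  // the dataset mean over the window
float sumSq = 0.0
for i = 0 to len - 1
    sumSq += math.pow(close[i] - m, 2)  // squared distance from the mean
plot(math.sqrt(sumSq / len), "Manual STDEV")
plot(ta.stdev(close, len), "Built-in STDEV")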
--------------------------- Flaws of Bollinger Bands ---------------------------
The largest problem with Bollinger Bands is the assumption that price has a normal distribution. This assumption is massively incorrect for many reasons that I will try to encapsulate into two points:
Price returns do not follow a normal distribution; every single symbol on every single timeframe has its own unique distribution that is specific to itself alone. Therefore all the tools, shortcuts, and ideas that we use for normal distributions do not apply to price returns, and since they do not apply here they should not be used. A more general approach is needed, one that allows each specific symbol on each specific timeframe to be treated uniquely.
The distributions of price returns on the positive and negative side are almost never the same. A more general approach is needed that allows positive and negative returns to be calculated separately.
In addition to the issues of the normal distribution assumption, the standard deviation formula (as shown above in the quick stats review) is essentially just a tame measurement of outliers (a more aggressive form of outlier measurement might be taking the differences to the power of 3 rather than 2). Despite this being a bit of a philosophical question, does the measurement of outlier intensity as defined by the STDEV formula really measure what we want to know as traders when we're experiencing volatility? Or would adjustments to that formula better reflect what we *experience* as volatility when we are actively trading? This is an open-ended question that I will leave here, but I wanted to pose it because it is a key part of how the Bollinger Bands work that we all take as a given.
Circling back to the normal distribution assumption: the standard deviation formula used in the calculation of the bands only encompasses the deviation of the candles that go into the moving average and has no knowledge of the historical price action. Therefore the levels of the bands may not really reflect how the price action behaves over a longer period of time.
------------ Delivering Factually Accurate Data That Traders Need------------
In light of the problems identified above, this indicator fixes all of these issues and delivers statistically correct information that discretionary and algorithmic traders can use, with truly accurate probabilities. It takes the price action of the last 2,000 candles and builds a huge dataset of distributions that you can directly select your percentiles from. It also allows you to have the positive and negative distributions calculated separately, or if you would like, you can pool all of them together in a combined distribution. In addition to this, there is a wide selection of moving averages directly available in the indicator to choose from.
Hedge funds, quant shops, algo prop firms, and advanced mechanical groups all employ the true return distributions in their work. Now you have access to the same type of data with this indicator, wherein it's doing all the lifting for you.
------------------------------ Indicator Settings ------------------------------
.
---- Moving average ----
Select the type of moving average you would like and its length
---- Bands ----
The percentiles that you enter here will be pulled directly from the return distribution of the last 2,000 candles. With the typical Bollinger Bands, traders would select 2 standard deviations and incorrectly think that the levels it highlights are the 95.5% levels. Now, if you want the true 95.5% level, you can just enter 95.5 into the percentile value here. Each of the three available bands takes the true percentile you enter here.
---- Separate Positive & Negative Distributions----
If this box is checked, the positive and negative distributions are treated independently, completely separate from each other. You will see that the widths of the top and bottom bands differ for each of the percentiles you enter.
If this box is unchecked, the negative and positive distributions are pooled together. You will notice that the widths of the top and bottom bands are exactly the same.
---- Distribution Size ----
This is the number of candles that the price return is calculated over. EG: to collect the price return over the last 33 candles, the difference of price from now to 33 candles ago is calculated for the last 2,000 candles, to build a return distribution of 2000 points of price differences over 33 candles.
NEGATIVE NUMBERS(<0) == exact number of candles to include;
EG: setting this value to -20 will always collect volatility distributions of 20 candles
POSITIVE NUMBERS(>0) == number of candles to include as a multiple of the Moving Average Length value set above;
EG: if the Moving Average Length value is set to 22, setting this value to 2 will use the last 22*2 = 44 candles for the collection of volatility distributions
MORE candles being included will generally make the bands WIDER, and their size will change SLOWER over time.
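Under the hood, the core idea can be sketched like this in Pine Script (a simplified illustration with assumed details such as the fixed return length and SMA basis, not the indicator's actual code):

//@version=5
indicator("Percentile band sketch", overlay=true)
n = 33                                     // candles per return (Distribution Size)
pct = input.float(95.5, "Band percentile")
// collect the last 2,000 absolute n-candle price changes
var rets = array.new<float>()
if not na(close[n])
    rets.push(math.abs(close - close[n]))
if rets.size() > 2000
    rets.shift()
// the band offset is read directly from the empirical return distribution
basis = ta.sma(close, 20)
offset = rets.size() > 0 ? array.percentile_nearest_rank(rets, pct) : na
plot(basis + offset, "Upper")
plot(basis - offset, "Lower")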
I wish you focus, dedication, and earnest success on your journey.
Happy trading :)
Intellect_city - Halvings Bitcoin CycleWhat is halving?
The halving timer shows when the next Bitcoin halving will occur, as well as the dates of past halvings. This event occurs every 210,000 blocks, which is approximately every 4 years. Halving reduces the emission reward by half. The original Bitcoin reward was 50 BTC per block found.
Why is halving necessary?
Halving maintains the algorithmically specified emission level. Anyone can verify that no more than 21 million bitcoins can be issued using this algorithm. Moreover, everyone can see how much was issued earlier, at what speed the emission is happening now, and how many bitcoins remain to be mined in the future. Even a sharp increase or decrease in mining capacity will not significantly affect this process. In this case, during the next difficulty recalculation, which occurs every 2016 blocks, the mining difficulty will be recalculated so that blocks are still found approximately once every ten minutes.
How does halving work in Bitcoin blocks?
The miner who assembles the block adds a so-called coinbase transaction. This transaction has no inputs, only an output paying the newly issued coins to the miner's own address. If the miner's block wins, the entire network will consider these coins to have been obtained legitimately. The maximum reward size is determined by the algorithm; the miner can claim the maximum reward allowed for the current period, or less. If the miner sets the reward higher than allowed, the network will reject the block and the miner will receive nothing. After each halving, miners have to halve the reward they assign to themselves, otherwise their blocks will be rejected and will not make it into the main branch of the blockchain.
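The subsidy schedule itself is easy to reproduce. Here is a small Pine Script sketch of the rule described above: the reward starts at 50 BTC and its satoshi amount is halved (rounded down) every 210,000 blocks, which is how the values in the table below are produced:

//@version=5
indicator("Bitcoin block subsidy sketch")
// reward in BTC at a given block height: 50 BTC, halved every 210,000 blocks,
// with the satoshi amount rounded down as in the real protocol
subsidy(int height) =>
    halvings = math.floor(height / 210000.0)
    math.floor(5000000000 / math.pow(2, halvings)) / 1e8
plot(subsidy(840000))  // 3.125 BTC after the fourth halving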
The impact of halving on the price of Bitcoin
It is believed that with constant demand, a halving of supply should double the value of the asset. In practice, the market knows when the halving will occur and prepares for this event in advance. Typically, the Bitcoin rate begins to rise about six months before the halving, and during the halving itself it does not change much. On average across past periods, the upper peak of the rate is observed more than a year after the halving. It is almost impossible to predict future periods because, in addition to the reduction in emission, many other factors influence the exchange rate: for example, major hacks or bankruptcies of crypto companies, the situation on the stock market, manipulation by “whales”, or changes in legislative regulation.
---------------------------------------------
Table - Past and future Bitcoin halvings:
---------------------------------------------
No. - Date - Block height - Reward:
0 - 03-01-2009 - 0 block - 50 BTC
1 - 28-11-2012 - 210000 block - 25 BTC
2 - 09-07-2016 - 420000 block - 12.5 BTC
3 - 11-05-2020 - 630000 block - 6.25 BTC
4 - 20-04-2024 - 840000 block - 3.125 BTC
5 - 24-03-2028 - 1050000 block - 1.5625 BTC
6 - 26-02-2032 - 1260000 block - 0.78125 BTC
7 - 30-01-2036 - 1470000 block - 0.390625 BTC
8 - 03-01-2040 - 1680000 block - 0.1953125 BTC
9 - 07-12-2043 - 1890000 block - 0.09765625 BTC
10 - 10-11-2047 - 2100000 block - 0.04882813 BTC
11 - 14-10-2051 - 2310000 block - 0.02441406 BTC
12 - 17-09-2055 - 2520000 block - 0.01220703 BTC
13 - 21-08-2059 - 2730000 block - 0.00610352 BTC
14 - 25-07-2063 - 2940000 block - 0.00305176 BTC
15 - 28-06-2067 - 3150000 block - 0.00152588 BTC
16 - 01-06-2071 - 3360000 block - 0.00076294 BTC
17 - 05-05-2075 - 3570000 block - 0.00038147 BTC
18 - 08-04-2079 - 3780000 block - 0.00019073 BTC
19 - 12-03-2083 - 3990000 block - 0.00009537 BTC
20 - 13-02-2087 - 4200000 block - 0.00004768 BTC
21 - 17-01-2091 - 4410000 block - 0.00002384 BTC
22 - 21-12-2094 - 4620000 block - 0.00001192 BTC
23 - 24-11-2098 - 4830000 block - 0.00000596 BTC
24 - 29-10-2102 - 5040000 block - 0.00000298 BTC
25 - 02-10-2106 - 5250000 block - 0.00000149 BTC
26 - 05-09-2110 - 5460000 block - 0.00000075 BTC
27 - 09-08-2114 - 5670000 block - 0.00000037 BTC
28 - 13-07-2118 - 5880000 block - 0.00000019 BTC
29 - 16-06-2122 - 6090000 block - 0.00000009 BTC
30 - 20-05-2126 - 6300000 block - 0.00000005 BTC
31 - 23-04-2130 - 6510000 block - 0.00000002 BTC
32 - 27-03-2134 - 6720000 block - 0.00000001 BTC