Have you ever wished you had like a genuine crystal ball for the economy?
Ah, wouldn't that be something?
Seriously though, a real shortcut to understanding those big, market-moving macroeconomic data points. You know, getting a clear picture before almost anyone else.
Yeah.
Imagine knowing what was likely coming, maybe even why, well before those big economic announcements hit. The ones that can really send markets spinning.
Well, that's pretty much exactly what we're going to dig into today.
Okay,
this deep dive is all about how a new generation of macroeconomic forecasts is delivering, uh, really unparalleled accuracy and, importantly, lead time for critical economic indicators.
Right.
We're going to focus specifically on the innovative approaches pioneered by a company called Exponential Technologies, or, as you might hear them called, Xtech.
Xtech. Got it.
Our mission really is to unpack how these um cutting edge predictions are actually built.
What makes them so unique in what's, let's face it, a crowded field,
and, crucially, why they matter so much for anyone, really, who's looking to be truly well-informed about the economy.
And to do that, we've, uh, we've really gone through a stack of their material: research documents, product descriptions, performance data, all straight from Xtech.
Yeah, it's quite comprehensive.
So it's a proper deep exploration into, well, the cutting edge of economic foresight.
Exactly.
Okay, so before we jump into the new stuff, the Xtech approach, let's set the scene a bit. What's the, um, the usual landscape for economic forecasting? What have people typically relied on?
And you know, where do those methods often fall short if you're someone who needs to make fast, informed decisions?
Yeah, good question. So, traditionally, institutional investors, big players, have leaned very heavily on what are called broker consensus estimates.
Okay, the consensus,
right? And these are often derived from surveys of economists, you know, asking a bunch of experts what they think. Or they rely on economic indicators that are, well, by their nature lagging. They tell you what has happened, not necessarily what's about to happen.
Looking backwards
precisely.
And a really significant challenge with these traditional methods is something called revision bias.
Revision bias.
What's that exactly?
So that initial data release, yeah,
the number that hits the screens and that markets react to instantly.
Yeah.
It frequently gets revised later, sometimes days, weeks, even months later.
Oh, okay.
So this raises a pretty important question if you're trying to understand market movements.
Yeah.
How can you truly grasp why a market reacted the way it did if the actual data it was reacting to changes after the fact?
That's the core of it, isn't it? It means you're almost always playing catchup.
You are.
If your whole strategy is based on that first release or just the general consensus view, you're inherently a step behind. It's like, uh, the ultimate hindsight-is-20/20 problem, but for real-time market decisions.
Couldn't have put it better myself. That's the fundamental issue.
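To see why revision bias bites, here's a minimal sketch in Python with entirely made-up numbers (nothing here is Xtech's data): the same naive "beat the consensus" signal, evaluated once against first-release values and once against the later-revised values of the same months.

```python
import pandas as pd

# Hypothetical first-release vs. revised values for one monthly indicator.
data = pd.DataFrame({
    "first_release": [150, 210, 90, 305],   # what markets saw on release day
    "revised":       [185, 175, 140, 260],  # the same months after revisions
}, index=pd.period_range("2024-01", periods=4, freq="M"))

consensus = 180  # a stand-in consensus estimate, the same every month

# Naive surprise signal: "go long" when the print beats consensus.
at_the_time  = data["first_release"] > consensus  # tradable on release day
in_hindsight = data["revised"] > consensus        # only knowable much later

# The two disagree in half the months here, so a backtest run on revised
# data would be trading signals that never existed in real time.
print(pd.DataFrame({"at_the_time": at_the_time, "in_hindsight": in_hindsight}))
```

Point-in-time data, which comes up later in this conversation, exists precisely to close that gap.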
Okay. So, if traditional methods leave you consistently playing catchup, looking in the rearview mirror, what's Xtech doing differently? How are they offering this, uh, sharp alternative?
Well, it really helps to understand the background of Morgan Slate. He's the CEO of Xtech. He brings, get this, 25 years of institutional investing experience. He's held senior roles at big names like Citadel and Merrill Lynch.
Wow. Okay. Real world experience.
Absolutely. And his academic background is in engineering, but with a specialty in finance, from MIT. So, a really strong quantitative foundation. He's basically been a pioneer in applying AI tools to pull out investment signals from really complex data.
That combination makes sense for this.
It does. And you see that deep expertise reflected right in Xtech's core methodology. They're leveraging, you know, cutting-edge indicators, proprietary alternative data, stuff others don't have access to, and AI to provide genuinely early predictions of these key macro announcements.
Early predictions. Okay. How does that work in practice?
Well, a key difference is their predictions are built from the bottom up.
Bottom up.
Yeah. Meaning they use entirely independent data sources. It's a unique approach, completely separate, orthogonal is the technical term, to those standard broker forecasts based on surveys.
So, they're not just polling economists.
Not at all. Instead, they're using advanced statistical modeling, machine learning, processing just vast amounts of data, both public and their own proprietary stuff, to build a completely new bottom-up model for every single macroeconomic metric they track.
That sounds incredibly data-intensive.
It is. And critically, it involves ingesting high-frequency, real-time data, stuff that's happening now, and even forward-looking survey responses, not just backward-looking indicators.
right? Getting ahead of the curve.
Exactly. And this leads us to a really interesting technique they use in their modeling called teacher forcing.
Teacher forcing. Okay. What does that mean?
So, think of it like this. Imagine you're learning something new. Teacher forcing is like having a tutor who gives you the correct answer immediately after you make a guess, rather than letting you continue down the wrong path based on your own potentially flawed understanding.
Ah okay. So constant correction based on reality.
Precisely. The model is constantly fed the most recent actual data, the ground truth to correct itself. It doesn't just rely on its own previous forecasts which might have drifted from reality.
So it's always learning from what actually happened very recently.
That's the key. It ensures the model is always tuned to the very latest economic reality, making its predictions incredibly adaptive and, uh, dynamic. It learns fast.
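To make teacher forcing concrete, here's a minimal sketch in plain NumPy. This is not Xtech's model, just the textbook idea: during training, the input at each step is the actual last observation (the ground truth), never the model's own previous prediction.

```python
import numpy as np

rng = np.random.default_rng(0)
series = np.cumsum(rng.normal(0.2, 1.0, 200))  # a synthetic trending series

# A one-parameter autoregressive model: next_value ~ w * current_value.
w, lr = 0.0, 1e-4

for epoch in range(50):
    for t in range(len(series) - 1):
        x = series[t]               # teacher forcing: feed the ACTUAL value...
        pred = w * x
        err = pred - series[t + 1]
        w -= lr * err * x           # ...and correct against the actual next value

# Without teacher forcing, training would feed the model's own output back
# in (x = previous pred), letting early errors compound down the sequence.
print(f"learned w = {w:.3f}")  # ends up near 1.0 for this slowly trending series
```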
That's a great way to explain it. Constant learning, constant adaptation. So, okay, what does all this sophisticated modeling actually mean for the data they put out? What specific economic indicators is Xtech forecasting? And you know, the big questions, how early and how accurately,
right? Let's get into the specifics. Their first release focuses on some really key US macroeconomic indicators. Things like the US Consumer Price Index, CPI, which everyone watches,
of course, inflation central.
Exactly. Then there's the Michigan Consumer Sentiment Index, the Conference Board Consumer Confidence Index. Both big measures of how consumers are feeling,
important for spending,
and US retail sales. Yeah.
But what's really fascinating, I think, is that they don't just forecast the headline CPI number. They provide forecasts for individual CPI categories.
Oh, interesting. So, breaking it down.
Yeah. And this is where the detail becomes incredibly valuable if you want a nuanced understanding of what's actually driving inflation. Yeah.
We're talking categories like shelter, housing costs, which is, you know, a massive 35% chunk of the CPI.
Sure.
Commodities at 19%, food at 14%, and even the more volatile stuff like gasoline at 3%, medical costs at 7%, education at 5%. Getting early signals on those specific areas is powerful.
Absolutely. Knowing where the inflation is coming from.
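To make those category weights concrete, here's a quick back-of-the-envelope sketch in Python. The weights are the ones quoted above; the month-over-month changes are made up purely for illustration.

```python
# CPI category weights as quoted above (as fractions of the index).
weights = {"shelter": 0.35, "commodities": 0.19, "food": 0.14,
           "medical": 0.07, "education": 0.05, "gasoline": 0.03}

# Hypothetical month-over-month % changes for each category.
mom = {"shelter": 0.4, "commodities": 0.1, "food": 0.2,
       "medical": 0.3, "education": 0.2, "gasoline": -1.5}

# Each category's contribution to the headline number is weight * change,
# so shelter alone contributes 0.35 * 0.4 = 0.14 percentage points here.
contributions = {k: round(weights[k] * mom[k], 3) for k in weights}
print(contributions)
print(round(sum(contributions.values()), 3))  # combined effect of these categories
```

The quoted categories cover only part of the index, so their contributions don't tell the whole headline story, but the arithmetic shows why an early read on shelter moves the needle far more than one on gasoline.

Okay. So that's the what. What about the when? How early are these forecasts?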
The timeliness is uh pretty remarkable. Take their main CPI first forecast. It's released on the third Monday of the current month.
Of the current month for that month's data.
Yes. Which means it's out more than 3 weeks ahead of the official government CPI release.
Three weeks. Wow.
And here's the kicker.
Yeah.
That's before most traditional brokers even start putting out their predictions.
So you're getting this insight before the consensus even forms.
Way before. And for another example, the Conference Board Consumer Confidence Index forecast, that comes out on the 15th day of the current month. Again, roughly 2 weeks ahead of the official release.
Okay, that lead time is significant, but lead time is only good if it's accurate, right? How do the numbers stack up on accuracy,
right? The million-dollar question. And the accuracy metrics are uh quite compelling. These are based on averages looking back from November 2017 all the way to April 2025. So, good long period.
Okay,
for CPI, their forecasts are available, as we said, 24 days before the official release on average, and they have what they call a hit rate of 36%.
Hit rate. What does that mean exactly?
That means their forecast is within plus or minus 0.003 of the official value over a third of the time.
Okay. And how does that compare?
Well, compared to professional economist consensus forecasts, Xtech's predictions are available on average 12 days earlier. And the data shows they're generally more accurate, too.
12 days earlier and more accurate. That's quite a claim.
It is. And we can look at some specific metrics. For the overall CPI first forecast, the month-over-month percentage change, they show an 82% correlation with the actual number.
82% correlation. That's strong.
Very strong. Directional accuracy, predicting whether CPI went up or down compared to the previous forecast, is 75%. And sign accuracy, just predicting whether the change was positive or negative, is 94%.
94% sign accuracy. So nearly every single time, they correctly predict whether inflation is going up or down each month.
That's what the historical data suggests for that period. Yes.
Okay, that's impressive. What about those volatile categories you mentioned like gasoline?
Yeah, good point. You'd expect those to be harder to predict, right?
Definitely.
But for their gasoline category first forecast, month-over-month percentage change, the numbers are actually even higher. Correlation 96%.
96. Wow.
Directional accuracy 87%. And sign accuracy 97%.
Incredible, especially for something as jumpy as gas prices.
It really highlights the power of their methodology using that real time bottomup data. You know, for anyone managing logistics, supply chains, or even just budgeting for their commute, that kind of consistent foresight for gasoline isn't just data. It's a massive strategic advantage.
No kidding. And what about consumer confidence?
For the Conference Board Consumer Confidence Index, looking at the index level prediction, they show an 89% correlation and 71% directional accuracy. Still very strong numbers.
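For readers who want to see how metrics like these are typically computed, here's a hedged sketch; the two series are invented, and Xtech's exact definitions (especially the hit-rate tolerance) may differ in detail.

```python
import numpy as np

# Invented forecast vs. actual month-over-month changes, in percent.
forecast = np.array([0.31, 0.18, -0.05, 0.42, 0.27, 0.10])
actual   = np.array([0.28, 0.22, -0.02, 0.35, 0.30, 0.05])

corr = np.corrcoef(forecast, actual)[0, 1]                 # co-movement

# Sign accuracy: was the positive/negative call right each month?
sign_acc = np.mean(np.sign(forecast) == np.sign(actual))

# Directional accuracy: did forecast and actual move the same way
# relative to the prior month?
dir_acc = np.mean(np.sign(np.diff(forecast)) == np.sign(np.diff(actual)))

# Hit rate: share of forecasts landing within a small tolerance.
hit_rate = np.mean(np.abs(forecast - actual) <= 0.03)

print(f"corr={corr:.2f}  sign={sign_acc:.0%}  dir={dir_acc:.0%}  hit={hit_rate:.0%}")
```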
Those are certainly compelling figures, especially, like you said, when you factor in how early these forecasts land, for someone who's, you know, used to seeing those official numbers come out and then get revised later,
right?
What are some of the typical challenges or maybe misses that Xtech has observed? How do they keep refining the models? Because no model is perfect, right?
Absolutely not. And that's a great question because continuous refinement is absolutely key. That teacher forcing mechanism we talked about, that's specifically designed for this. Yeah, when a forecast does deviate from the actual official release, the model immediately learns from that discrepancy. It adjusts its internal workings, its parameters based on that new truth.
So, it learns from its mistakes essentially instantly
Pretty much, yeah. Now, sometimes you get unforeseen external shocks. Think major geopolitical events, sudden supply chain implosions, things like that.
Black swan events
Kind of. Those can introduce volatility that even the most advanced models might struggle with initially, of course.
Sure.
But by incorporating new data so rapidly and continuously learning, the models adapt. They improve their performance for future similar events. It's this iterative process, constantly backtesting and refining against real-world outcomes, that really drives that sustained accuracy over time.
That dynamic adaptation really does underline the difference from more static traditional models.
Okay, so let's pull back the curtain a bit more. How does Xtech actually do this? How do they achieve such granular, forward-looking insights? What's the practical, uh, behind-the-scenes methodology?
Okay. Yeah, let's walk through it. It's a rigorous multi-step process. We can use CPI as the example.
Perfect.
So, step one, they identify what they call surrogate CPI features. These are basically other economic indicators that have shown a strong influence on the individual components within the CPI.
Finding the drivers for each piece.
Exactly. Step two, they gather and prepare just massive amounts of relevant data inputs and they perform something called feature engineering.
Feature engineering.
It's about transforming that raw data into specific features, specific inputs that are designed to maximize the predictive accuracy of the models. It's a crucial step.
Okay. Prepping the data smartly.
Right. Step three, they actually forecast the individual CPI components, shelter, food, energy, etc., modeling and predicting the level of each one using approaches tailored specifically to that component.
Not a one-size-fits-all model.
Definitely not. Step four is constant evaluation and refinement. They're always analyzing performance metrics, seeing how the models did, and iterating to make improvements based on new data and how the models performed previously.
Like that teacher forcing loop.
Exactly. And then finally, step five, they aggregate all those individual component forecasts together to produce the overall CPI month- over-month percentage change forecast. The headline number everyone waits for.
Wow. Okay. That is genuinely bottom up. Building it piece by piece.
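Pulling the five steps together, here's a hedged structural skeleton in Python. It's purely illustrative of the bottom-up shape just described, not Xtech's implementation: one tailored model per component (step three), aggregated by weight into the headline figure (step five).

```python
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class ComponentModel:
    """One tailored forecaster per CPI component (step three)."""
    weight: float                      # the component's share of the index
    predict: Callable[[dict], float]   # engineered features -> m/m % change

def headline_forecast(models: Dict[str, ComponentModel],
                      features: Dict[str, dict]) -> float:
    """Step five: weight and sum component forecasts into the headline number."""
    return sum(m.weight * m.predict(features[name]) for name, m in models.items())

# Illustrative stand-ins only. Real component models would be fit on the
# engineered features from steps one and two, and re-evaluated against
# each official release per step four.
models = {
    "shelter":  ComponentModel(0.35, lambda f: 0.9 * f["rent_index_mom"]),
    "gasoline": ComponentModel(0.03, lambda f: 1.0 * f["wholesale_gas_mom"]),
}
features = {"shelter":  {"rent_index_mom": 0.4},
            "gasoline": {"wholesale_gas_mom": -2.0}}

print(f"headline m/m forecast: {headline_forecast(models, features):.3f}")
```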
It is. And this whole robust process also includes doing extensive correlation studies. That's how they identify those potential leading macro features in the first place.
Ah, finding things that predict other things,
right? For example, they found strong correlations between things like average hourly earnings and the future CPI, or private payroll numbers and future unemployment rates. And crucially, they use their own proprietary used-car price data, which often gives a very early signal for inflation trends in that important sector, feeding into the broader CPI.
Having proprietary data must be a big edge.
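Here's a minimal sketch of that kind of lead-lag correlation study, with synthetic stand-in series rather than real earnings or CPI data. By construction the "leading" series runs two months ahead, so the scan should peak at lag 2.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n = 120  # ten years of monthly observations

# Synthetic stand-ins: "cpi" is built to lag "earnings" by two months.
earnings = rng.normal(0.3, 0.2, n)
cpi = 0.7 * np.roll(earnings, 2) + rng.normal(0, 0.05, n)
df = pd.DataFrame({"earnings": earnings, "cpi": cpi}).iloc[2:]  # drop wrap-around

# Shift the candidate leading indicator by each lag and measure correlation.
for lag in range(6):
    r = df["earnings"].shift(lag).corr(df["cpi"])
    print(f"lag {lag} months: corr = {r:.2f}")
```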
It absolutely can be. Another vital element, and this is really interesting, is their use of point-in-time economic data.
Point in time. Okay. What's that about?
It means that if you're using their platform or data feed, you can look back and see any historical indicator exactly as it appeared at any specific moment you choose in the past.
So, not the finally revised number, but what people actually saw on that day.
Exactly. It captures the initial release and all the subsequent revisions. This is absolutely critical because it eliminates that revision bias we talked about earlier when you're doing back testing or building your own models.
Ah, I see. So, you're testing your strategy against the information that was actually available at the time decisions were being made.
Precisely. It reflects what markets and policymakers truly knew or thought they knew at that specific moment.
And just for context, their US CPI data using this point-in-time approach goes all the way back to 1913.
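One simple way a point-in-time store can work, and this is a hedged sketch of the general pattern, not Xtech's actual schema, is to keep every vintage of every observation and answer "as of" queries by filtering on release date:

```python
import pandas as pd

# Each row is one vintage: a reference month's value as published on a
# given release date. Values are illustrative only.
vintages = pd.DataFrame([
    {"ref_month": "2024-03", "released": "2024-04-10", "value": 0.4},  # initial print
    {"ref_month": "2024-03", "released": "2024-05-15", "value": 0.3},  # revision
    {"ref_month": "2024-04", "released": "2024-05-15", "value": 0.2},  # initial print
])
vintages["released"] = pd.to_datetime(vintages["released"])

def as_of(df: pd.DataFrame, ref_month: str, date: str) -> float:
    """Return ref_month's value exactly as it was known on `date`."""
    known = df[(df["ref_month"] == ref_month) & (df["released"] <= date)]
    if known.empty:
        raise LookupError(f"{ref_month} had not been released by {date}")
    return known.sort_values("released")["value"].iloc[-1]

print(as_of(vintages, "2024-03", "2024-04-20"))  # 0.4 -- the pre-revision view
print(as_of(vintages, "2024-03", "2024-06-01"))  # 0.3 -- the post-revision view
```

Backtests built on top of this kind of "as of" query only ever see what was knowable at the time, which is exactly how the revision bias from earlier in the conversation gets eliminated.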
Wow, that's incredible historical depth.
It really is. They also provide extensive accuracy diagnostics with the data, things like confidence intervals around their forecasts and metrics like Theil's U statistic.
Theil's U? What does that tell you?
So, Theil's U is a pretty powerful tool. It doesn't just tell you if a forecast was, you know, right or wrong in magnitude. It helps explain why it might have differed from the actual outcome: was the error due to bias in the model, or variance, or covariance? It gives users a much deeper understanding of the forecast's strengths and potential weaknesses.
So you don't just get the number, you get context on its reliability.
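For the curious, the classic Theil decomposition splits mean squared forecast error into bias, variance, and covariance proportions that sum to one. A minimal sketch with made-up series (Xtech's exact diagnostics may differ):

```python
import numpy as np

forecast = np.array([0.31, 0.18, -0.05, 0.42, 0.27, 0.10])
actual   = np.array([0.28, 0.22, -0.02, 0.35, 0.30, 0.05])

mse = np.mean((forecast - actual) ** 2)
sf, sa = forecast.std(), actual.std()          # population std (ddof=0)
r = np.corrcoef(forecast, actual)[0, 1]

# MSE = (mean gap)^2 + (std gap)^2 + 2*(1 - r)*sf*sa, split proportionally:
bias_prop = (forecast.mean() - actual.mean()) ** 2 / mse  # systematic offset
var_prop  = (sf - sa) ** 2 / mse                          # volatility mismatch
cov_prop  = 2 * (1 - r) * sf * sa / mse                   # unsystematic noise

print(f"bias={bias_prop:.2f}  variance={var_prop:.2f}  covariance={cov_prop:.2f}")
print(f"sum = {bias_prop + var_prop + cov_prop:.2f}")     # 1.00 by construction
```

A forecast whose error sits mostly in the covariance bucket is about as good as it can get; a big bias bucket points to a correctable systematic offset.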
Exactly. And on top of all that, the data they provide also includes insights from aggregated economist predictions, the traditional consensus view. But not just the average consensus number, they also show the dispersion of those forecasts.
Ah, so how much disagreement there is among the economists,
right? Which is a great way to understand the level of market uncertainty around a particular release
That comprehensive diagnostic capability sounds incredibly valuable for actually using these forecasts effectively, for understanding them.
It really is. It builds trust and allows for more sophisticated use.
Okay, so we've covered the what, the forecasts and the data, and the how, the bottom-up modeling, teacher forcing, point-in-time data. Let's bring it back to you, the listener. If you're tracking the economy, maybe managing investments, or just trying to stay genuinely ahead of the curve, why is this kind of foresight truly invaluable in practice? What does it let you do?
Fundamentally, the core value here is simple but profound. It's foresight, not hindsight.
Foresight, not hindsight. I like that.
This advanced insight enables users to proactively anticipate macroeconomic shifts rather than just reacting after the fact.
Okay, anticipate. How? Give us some examples.
Well, think about how this can be applied. You can strategically position your investment portfolio ahead of major macroeconomic releases, aiming to capture those potential market-moving events.
Getting positioned before the news breaks.
Exactly. It can significantly enhance trading strategies. You can integrate these forecasts into, say, directional trades, betting on market direction or relative value trades, comparing related assets. It allows for potentially more precise entries and exits.
Makes sense.
You could even potentially identify and capitalize on market mispricings, situations where the traditional consensus estimates seem out of line with what Xtech's more data-driven early forecast is suggesting.
Finding those discrepancies,
right? Furthermore, it allows you to actively manage risk exposure. If the forecast suggests a shift in inflation or growth or consumer spending is coming, you can adjust your portfolio allocations in anticipation of that change.
Proactive risk management.
Absolutely. And it improves market timing overall by integrating these predictive insights into your entry and exit strategies across different asset classes. Think commodities, fixed income, equities, even foreign exchange.
So, applications across the board
Pretty much. Ultimately, this isn't just about making predictions like some kind of oracle. It's about providing an independent, indispensable source of macroeconomic market intelligence, especially for active decision-makers, active risk-takers, and also risk-management functions within larger institutions.
An independent check on the consensus,
A very early, data-driven, independent check. It allows users to essentially get ahead of unexpected market shifts, those surprises or disappointments relative to consensus, and maybe even be positioned to be a liquidity provider, to step in calmly when others are reacting, perhaps impulsively, to sudden news because they didn't see it coming.
That certainly paints a compelling picture of the potential value. For a listener, maybe someone running their own portfolio or advising clients, how quickly can they actually integrate this kind of dynamic, forward-looking insight? Is it, like, you know, an easy plug-and-play thing, or does it require a big shift in how they operate?
That's a practical point. It's actually designed to be quite adaptable. For the more quantitative users, the data is easily integrated into their existing models and algorithms, often through APIs, application programming interfaces, so it can feed directly into automated systems. (There's a rough sketch of what that can look like just after this exchange.)
Okay, so the quants can plug it right in
pretty much. But for those who are more qualitative, more discretionary in their approach, the early and accurate forecasts provide a really vital independent viewpoint, something to challenge their existing assumptions or to inform strategic discussions well before the traditional data points even land.
So it serves both styles.
Yes. Now look, any new data source requires some familiarization. Of course, you need to understand its nuances.
Sure. A learning curve?
A bit, but the value proposition of having that genuine early foresight often outweighs the integration effort pretty quickly, simply because it fundamentally shifts the information advantage in your favor.
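As a purely hypothetical illustration of that plug-in style of integration: the endpoint, parameters, and response fields below are invented for this sketch and are not Xtech's actual API, which isn't documented in our source material. A minimal pull might look like this:

```python
import requests  # the widely used HTTP client library

# Hypothetical endpoint and schema, invented purely for illustration;
# consult the vendor's real API documentation for actual details.
BASE_URL = "https://api.example.com/v1/forecasts"

def latest_forecast(indicator: str, api_key: str) -> dict:
    """Fetch the latest forecast for one indicator (hypothetical schema)."""
    resp = requests.get(
        BASE_URL,
        params={"indicator": indicator},
        headers={"Authorization": f"Bearer {api_key}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()  # e.g. {"indicator": "us_cpi_mom", "forecast": 0.3}

# A quant desk would call something like this on a schedule and feed the
# number into existing signal or risk models; a discretionary user might
# simply compare it against the published consensus before a release.
```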
It's clear that these advanced forecasts really do offer a significant edge for anyone trying to navigate the, let's face it, complexities of the global economy. It provides that clarity and speed that, uh, the informed learner, our listener, often craves.
Indeed. And maybe a final thought to leave people with.
In a world that's increasingly, overwhelmingly driven by data, the true competitive edge may no longer be about simply having more information than the next person. It might actually be about having earlier and more accurate information.
Earlier and more accurate.
Yeah. And that fundamentally shifts how we understand and how we can react to the economic landscape. So the question for you is, what new strategies might you consider if you consistently had this kind of foresight at your fingertips?
A powerful thought to ponder. Definitely something to reflect on regarding the implications of that kind of predictive power, whether it's for your own financial decisions or just your broader understanding of the global economy.