- wyatt8240
- Jul 1
- 10 min read
Imagine for a moment having access to a truly predictive lens on the economy. Not some mythical crystal ball, but a real, tangible way to see major economic announcements coming weeks before everyone else, giving you a huge informational edge.
Yeah, for decades that kind of foresight has been the elusive holy grail for anyone in the markets, right?
But, you know, breakthrough technologies are actually making it a reality now. We're shifting from just reacting to, well, anticipating.
That's exactly what we're diving into today: these groundbreaking global macroeconomic forecasts, developed by LSEG in partnership with Exponential Technologies, or Xtech,
right?
And our deep dive today is pulling insights straight from their detailed product info, technical white papers, and performance data, all to show how these forecasts give you that significant edge.
Yeah, our mission here is really to unpack their unique approach, look at the cutting-edge tech behind it all, and most importantly, figure out what this means for you, especially if you're aiming to get that strategic advantage in understanding these market-moving economic events.
Okay, this level of foresight sounds almost revolutionary.
So how is it actually achieved? Let's start with the team behind it: this partnership between LSEG and Xtech.
It's a fascinating combination, actually. Their strengths really complement each other.
How so?
Well, you've got Exponential Technologies Xtech led by CEO Morgan Slade. He brings the advanced analytics, the data tech side.
Okay.
And Morgan's background is pretty impressive. We're talking 25 years in institutional investing at places like Citadel and Merrill Lynch, plus an MIT engineering background.
Wow.
Yeah. He's really a pioneer in using AI to find investment signals, create these kinds of data products.
And Else's role in this.
LSEG brings the foundation: world-leading historical point-in-time economic data, you know, PIT data, and also robust consensus economic estimates.
Got it. So, Xtech's AI and modeling, fueled by LSEG's deep, authoritative data.
Exactly. It's that combined power that lets them say they're producing, you know, the most accurate CPI forecast in the world, often weeks ahead of traditional estimates.
Okay, let's get specific then. What exactly are they forecasting in this first release and why these particular indicators? Why do they matter so much to you, the listener?
So, this first release focuses on key US indicators, the ones that consistently move markets,
like the US Consumer Price Index, CPI, which measures the change in prices consumers pay for everyday goods and services.
Inflation, essentially.
Right. Then there's the Michigan Consumer Sentiment Index and the Conference Board Consumer Confidence Index. Both are crucial gauges of how optimistic consumers are feeling, which affects their spending.
Okay, sentiment and confidence.
And finally, US retail sales, which tracks total sales from retail businesses and gives you a snapshot of consumer demand right now.
That's a pretty powerful set of indicators.
but you mentioned something earlier that they forecast individual CPI categories. That sounds like a big deal. Why is that detail so important?
Oh, it's critical because the headline CPI number, yeah, it's important, but it's an average, an aggregate,
right?
Understanding the individual bits like shelter, energy, food, lets you see what's actually driving inflation. Is it housing costs spiking or maybe gas prices? It gives you a much more nuanced, precise view that the big number just can't.
That makes a lot of sense. So this bottom-up approach really sounds fundamentally different. Can you walk us through the actual method, the unique approach that sets it apart from how things are usually done?
Certainly. They use bottom-up modeling, and they describe it as orthogonal to traditional broker forecasts.
Orthogonal. What does that mean in this context?
It basically means they're not just using a slightly different model. They're fundamentally sidestepping the built-in biases, and importantly the lag, you get with traditional methods like surveys or older econometric models.
So, not relying on potentially stale info.
Exactly. They're building it from the ground up using granular, high-frequency, real-time data. They combine that with LSEG's point-in-time historical data and their own proprietary alternative data. And then AI and machine learning power the whole thing. It's a real shift in philosophy for macro forecasting.
Okay. So, they're constructing the picture from fresh, more responsive data sources.
Can you give us a sense of the actual process? For something really key like the CPI forecast, how do the steps break down?
Sure. It's essentially a five-step process. First, they identify what they call surrogate CPI features.
Surrogate features.
Yeah. These are other economic indicators that strongly influence specific bits of the CPI. Think about things like average hourly earnings and how that might link to certain price pressures or private payrolls and unemployment data. Even proprietary data like tracking used car prices which directly correlates with that CPI category.
Okay. So, finding those specific predictive links. Very insightful.
Very. Step two is gathering and meticulously prepping all this diverse data. And this includes something called feature engineering.
It's kind of the art and science of turning that raw data into the signals that are most predictive for the AI models, making the data speak clearly, if you will.
Got it.
Then step three: they forecast the individual CPI components, often using slightly different, tailored approaches for each one. Step four is crucial: constant evaluation and refinement. They're always tweaking the models, making them better,
continuous improvement,
right? And finally, step five, they aggregate all those individual forecasts together to get the overall CPI, month-over-month percentage change number.
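As a rough illustration, the five steps might look something like this in Python. To be clear, everything here is a hypothetical stand-in: the component names, the surrogate-feature mapping, the trivial "persistence" model, and the weights are illustrative assumptions, not the vendors' actual implementation.

```python
# Step 1: map each CPI component to hypothetical surrogate features
surrogates = {
    "used_vehicles": ["used_car_prices"],
    "shelter": ["avg_hourly_earnings"],
}

# Step 2: gather and feature-engineer the raw data
# (here the "engineering" is just a month-over-month % change)
def pct_change(series):
    return [(b - a) / a * 100 for a, b in zip(series, series[1:])]

raw = {
    "used_car_prices": [100.0, 101.0, 103.0],     # made-up index levels
    "avg_hourly_earnings": [30.0, 30.3, 30.6],
}
features = {name: pct_change(vals) for name, vals in raw.items()}

# Step 3: forecast each component (toy model: last feature value persists)
component_fcst = {
    comp: features[feats[0]][-1] for comp, feats in surrogates.items()
}

# Step 4 (constant evaluation/refinement) is omitted in this sketch.
# Step 5: aggregate the component forecasts with their index weights
weights = {"used_vehicles": 0.03, "shelter": 0.35}  # illustrative weights
cpi_contribution = sum(weights[c] * component_fcst[c] for c in surrogates)
```

The point of the sketch is the shape of the pipeline, not the modeling: real component models would be tailored per category, as described above.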
I read about this technique they use called teacher forcing. Sounds fascinating. What is that and why is it important here?
Yeah, teacher forcing or sometimes called one step ahead forecasting. It's quite clever. Basically, it means the model is constantly learning from the very latest actual data point that comes in
instead of just relying on its own previous guess.
Exactly. Rather than potentially drifting off course based on its own past forecast which might have errors built up, it gets corrected or forced back towards reality by the newest actual number. It's like real time calibration.
That sounds like a really powerful way to keep the model accurate and grounded in well reality.
It is. It ensures the predictions are always tied to the latest ground truth, not just echoing prior assumptions,
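The idea can be shown with a toy one-step-ahead forecaster. This is a minimal sketch assuming a made-up AR(1)-style model and coefficient, nothing like the production models; it only illustrates the difference between feeding the model its own forecasts versus the latest actuals.

```python
def forecast_series(history, phi=0.8, teacher_forcing=True):
    """Roll a toy AR(1)-style one-step-ahead forecast across a series.

    With teacher_forcing=True, each step conditions on the latest *actual*
    observation (re-anchoring on ground truth as it arrives). With False,
    the model feeds on its own previous forecast, so errors compound.
    """
    preds = []
    prev = history[0]
    for t in range(1, len(history)):
        pred = phi * prev              # one-step-ahead prediction
        preds.append(pred)
        # teacher forcing: correct back toward reality with the newest actual
        prev = history[t] if teacher_forcing else pred
    return preds

actuals = [1.0, 0.9, 1.1, 0.8, 1.0]    # hypothetical data
with_tf = forecast_series(actuals, teacher_forcing=True)
without = forecast_series(actuals, teacher_forcing=False)
```

Without teacher forcing, the free-running forecasts decay toward the model's own drift regardless of what actually happened; with it, every prediction stays anchored to the most recent observed value.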
which must be a big reason for their accuracy claims. And the performance data seems to back that up. This is where you really see the edge, right? How do they stack up on accuracy and, crucially, lead time?
Well, the lead time alone is frankly a game-changer. These LSEG forecasts are available up to 25 days before the official CPI release.
25 days?
Yeah, compare that. That's weeks before most broker estimates even start trickling out. And on average, it's about 12 days earlier than the typical economist prediction.
That's not just a small edge. That's a massive head start.
It really is a significant leap in foresight.
Okay. And the accuracy claims are bold, most accurate CPI forecast in the world. How do they actually back that up? What do the numbers show?
They have independent validation, and it's pretty robust. So for the main CPI month-over-month percentage change, that first forecast, the one that comes out nearly a month ahead, yeah,
it shows an 82% correlation with the actual number,
82%
and 75% directional accuracy, meaning it gets the up or down movement right three-quarters of the time. Plus 94% sign accuracy, so it nails whether the change is positive or negative almost perfectly.
That's impressive for a forecast that early. What about the later one?
The second forecast, which comes out about a week before the official release and uses even more recent data, gets even better. Correlation bumps up to 84%, directional accuracy to 80%, and it still maintains that 94% sign accuracy.
And for the index level itself,
It's even stronger for the actual CPI index level forecasts. Both the first and second show a correlation of, get this, 99.99%, and directional accuracy of 92%.
Wow. Okay. Those numbers are exceptionally strong. For someone using this, how does that precision translate into, say, strategic or trading advantages?
Well, think about a really volatile component like gasoline CPI. It can really swing the main number.
Right. It's often excluded from core inflation for that reason.
Exactly. But their first forecast for gasoline CPI shows a 96% correlation, 87% directional accuracy, and 97% sign accuracy.
96% correlation on gasoline nearly a month out.
Yep. So that granularity lets you anticipate big moves in a key volatile component even if it's stripped out of the core number everyone watches.
Okay. And other indicators like consumer confidence.
Similar story. Their conference board consumer confidence index forecast that's out about 2 weeks ahead of the official release. It shows an 89% correlation and 71% directional accuracy.
So it's not just early. It's consistently accurate across different types of indicators, even the volatile ones. You mentioned a hit rate, too. What does that refer to?
Right, the hit rate. For the LSEG CPI forecasts, they achieve a 36% hit rate. That means they match the official CPI value within a very tight band, plus or minus just three decimal places.
And this is
on average 24 days before the official release.
Okay, hitting the number that precisely that far in advance, that's extraordinary.
It really highlights the degree of precision they're achieving given how dynamic these things are and the lead time involved.
These granular insights seem so important, especially when you think about how the overall CPI is actually put together.
Can you remind us of the weighting of those individual categories? Why does understanding them matter so much?
Absolutely. It's key to remember the headline CPI isn't monolithic. It's a weighted average reflecting how consumers spend their money.
Right.
So shelter, for example, is the biggest chunk by far. About 35% of the index.
A third of it roughly.
Yeah. Then you've got commodities ex food and energy, that's around 19%. Food is about 14%, transportation 6%, medical services 7%, and education and communication around 5%.
Smaller pieces, but they add up.
They do. And then recreation is about 3%, gasoline specifically is also around 3%, electricity 2%, other utilities about 1%. And a small other category.
And like you said before, forecasting something specific like gasoline, even though it's only 3%, is crucial because its swings can really impact the main number people react to.
Exactly. That volatility matters. Understanding those underlying moving parts is how you truly anticipate inflation trends and their impact rather than just, you know, reacting after the fact to the single big number.
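The gasoline point is easy to see numerically. This sketch uses the approximate weights quoted above (rounded, illustrative, not official BLS figures) and hypothetical month-over-month changes to show how a swing in a 3% component still moves the headline.

```python
weights = {                          # approximate index shares quoted above
    "shelter": 0.35,
    "commodities_ex_food_energy": 0.19,
    "food": 0.14,
    "medical_services": 0.07,
    "transportation": 0.06,
    "education_communication": 0.05,
    "recreation": 0.03,
    "gasoline": 0.03,
    "electricity": 0.02,
    "other_utilities": 0.01,
    "other": 0.05,
}

# hypothetical month-over-month % changes: everything quiet at 0.2%...
mom_changes = {k: 0.2 for k in weights}
mom_changes["gasoline"] = 4.0        # ...except a big gasoline swing

# headline CPI m/m is the weighted average of component changes
headline = sum(weights[k] * mom_changes[k] for k in weights)
```

Even at a roughly 3% weight, the gasoline spike lifts the headline from 0.2% to about 0.31%, more than half again the "quiet" reading, which is exactly why forecasting that one volatile component matters.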
Okay. So, let's bring this home. For the sophisticated professional, the person navigating these markets day in day out.
Yeah.
How does this capability truly change the game for them? What's the practical difference this makes?
Well, fundamentally, it shifts your entire strategy from being reactive to being proactive.
How so? In terms of investment strategy,
it allows for genuine pre-positioning. You can adjust your portfolio before the macroeconomic release hits the wires, potentially capturing alpha from market moves before prices fully adjust. You can enhance your trading: directional bets, relative value trades. And you can spot, and maybe even arbitrage, market mispricings when the consensus forecasts are off base compared to this earlier, more accurate data.
So, finding opportunities before the crowd even knows they exist. What about from a risk-management perspective?
Huge implications there, too. This foresight lets you adjust your portfolio allocations proactively. If you see signs of inflation picking up, or growth slowing, or consumer spending changing based on these forecasts, you can tweak your risk exposure before it becomes a problem. It enhances your whole risk framework.
Makes sense.
And beyond direct trading or risk adjustment, how does it help with broader strategic planning or analysis?
Oh, it's transformative for that, too. You can optimize your asset allocation decisions: equities, fixed income, commodities, FX. With a much clearer forward view, you can improve your market timing for getting in or getting out. It provides a really solid foundation for fundamental research and for developing strategies based on forward-looking macro models, not just backward-looking data. And critically, it helps you anticipate surprises.
Surprises versus the consensus.
Exactly. If you see a number coming that's likely to shock the market relative to the consensus view, you can pre-position. You might even be able to act as a liquidity provider during the temporary market dislocations caused by the surprise. It's really about shifting from analyzing hindsight to actually leveraging foresight.
And underpinning all of this foresight, all this modeling, is the data itself. You mentioned LSEG's core data assets. Can you expand on that? Particularly the point-in-time data.
Yes, that's absolutely fundamental. LSEG's point-in-time economic data, the PIT data, is crucial. What it does is let you see any historical indicator exactly as it looked at a specific moment in the past.
So not just the final revised number we see today.
Precisely. It captures the initial release number and all the subsequent revisions that happened over time. Economic data gets revised frequently after it first comes out.
Right. Sometimes significantly. So this solves the problem of revision bias when you're testing models.
Exactly. It eliminates that revision bias in back testing and model building. It means you're assessing your forecasting models based on the actual information the market had at that specific point in time. It makes the analysis much more realistic and accurate.
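What "point-in-time" means in practice can be shown with a toy in-memory vintage store. The class, method names, and the example GDP revisions below are illustrative assumptions, a sketch of the concept rather than any real PIT database.

```python
import bisect
from datetime import date

class PITSeries:
    """Stores every vintage (release) of a data point and answers
    'what value was known as of date D?' queries."""

    def __init__(self):
        self.vintages = []              # sorted list of (release_date, value)

    def add_release(self, release_date, value):
        bisect.insort(self.vintages, (release_date, value))

    def as_of(self, query_date):
        """Latest release on or before query_date, or None if nothing
        had been published yet."""
        i = bisect.bisect_right(self.vintages, (query_date, float("inf")))
        return self.vintages[i - 1][1] if i > 0 else None

# a hypothetical quarterly GDP print, released and then revised twice
gdp = PITSeries()
gdp.add_release(date(2024, 4, 25), 1.6)   # initial (advance) estimate
gdp.add_release(date(2024, 5, 30), 1.3)   # second estimate (revised)
gdp.add_release(date(2024, 6, 27), 1.4)   # third estimate

gdp.as_of(date(2024, 5, 1))   # the figure the market knew at that time
gdp.as_of(date(2024, 7, 1))   # the latest revised figure
```

A backtest that queries `as_of` the historical decision date sees only what the market saw then, which is exactly how revision bias is eliminated.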
That sounds invaluable. And the depth of this data,
it's incredible. The US CPI data goes back to 1913. GDP figures back to 1947. Just immense historical context.
Wow. And they also use the US Reuters economic polls data.
Yes, that's another important piece. It provides the consensus forecast from a wide range of economists, but importantly, it also shows the dispersion in those forecasts.
Dispersion meaning how spread out the economists' guesses are.
Exactly. It helps you gauge the level of uncertainty around a particular forecast. If all the economists are tightly clustered, there's high confidence. If they're all over the place, there's a lot of uncertainty. That's another vital layer of insight for making decisions.
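The dispersion idea takes only a few lines to illustrate; the poll numbers here are hypothetical.

```python
import statistics

# two hypothetical economist polls for the same release (m/m % forecasts)
tight_poll = [0.30, 0.31, 0.29, 0.30, 0.32]   # clustered: high confidence
wide_poll  = [0.10, 0.45, 0.25, 0.60, 0.05]   # scattered: high uncertainty

# the consensus is typically a central tendency of the poll...
consensus_tight = statistics.median(tight_poll)
consensus_wide  = statistics.median(wide_poll)

# ...while the dispersion (here, standard deviation) gauges uncertainty
dispersion_tight = statistics.stdev(tight_poll)
dispersion_wide  = statistics.stdev(wide_poll)
```

Two polls can produce similar-looking consensus numbers while carrying very different levels of conviction, which is why the spread is a decision input in its own right.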
What a deep dive this has been. Really from that initial idea of a predictive lens through the nuts and bolts of the methodology, the partnership, the data, and then seeing that remarkable performance, it really shows how advanced analytics and deep data are changing the game in understanding economic shifts.
It's a powerful demonstration, isn't it? This collaboration between LSEG and Xtech shows how sophisticated tech is making something that seemed almost impossible, well, a reality, offering a really significant, distinct advantage in navigating today's pretty complex markets.
So, just consider for a moment, think about how powerful having a more accurate, much earlier look at critical economic data could be for any decision-making process you're involved in, whether that's managing investments, shaping business strategy, or even just trying to get a clearer picture of the economic currents shaping our world.
This isn't just about getting better numbers slightly faster. It feels more fundamental. It's about shifting perspective, moving from hindsight to true foresight. And the real question is, what new opportunities might that shift unlock for you?