
X Tech Global Macro Forecasts Ep. 4

Okay, let's dive in. Imagine getting a peek at tomorrow's economic headlines, maybe weeks before they actually hit. We're talking about having, you know, a clearer crystal ball for the economy, not just one that's maybe slightly less cloudy. Today, our deep dive is into this cutting-edge world of macroeconomic forecasting. Uh, specifically, we're exploring the innovative approaches of a company called X-Tech. We've gathered insights from their methodologies, their product offerings, and frankly, some pretty compelling performance data. Our mission is to really understand how X-Tech is delivering these earlier, more accurate economic predictions and, uh, what that level of foresight means for you, you know, for being truly well-informed in this incredibly fast-paced world.

Yeah. And it's fascinating because traditionally it's always been incredibly challenging to get ahead of economic data, right? I mean, traditional forecasts, they often rely on methods that kind of leave you playing catch-up. You're reacting, not anticipating. You miss that crucial lead time.

Exactly. That lead time.

And that's why this unique, almost, uh, orthogonal approach that X-Tech brings is so noteworthy. It's not just a little bit better. It's fundamentally a different way of tackling the problem.

So, who are the people behind this, uh, potentially groundbreaking shift?

Well, at the heart of it is Exponential Technologies, or X-Tech, and it's led by CEO Morgan Slade. His background is really key here. I think we're talking 25 years in institutional investing. He's been a senior researcher, senior trader, portfolio manager, uh, even global head of trading at some big names. Citadel, Milbour, Richfield, Merrill Lynch, you get the picture. Plus, he's got MIT engineering degrees with a specialty in finance. And what seems really relevant is his early work sort of pioneering the use of AI tools to pull investment signals out of data and creating these innovative data products for institutional investors.

right? And if you connect those dots, X-Tech's method isn't just about getting data faster. It's built differently from the ground up. Their global macro forecasts don't just lean on the usual models. They're using cutting-edge indicators, their own proprietary alternative data sources, and, uh, advanced artificial intelligence. They build these forecasts, as you said, bottom up, from independent sources.

Okay. And that's what makes them orthogonal to broker forecasts. That's the term they use.

Orthogonal. So totally separate independent insights.

Exactly. It means they're not just echoing or slightly tweaking the existing market consensus which let's be honest can sometimes suffer from a bit of herd mentality. The whole point here is to deliver these insights uh sometimes weeks before those traditional consensus estimates are even put out there.

Okay, so knowing what they predict and how well they claim to do it, the next logical question is, well, what's under the hood? What's driving these, uh, pretty impressive feats?

Well, first let's talk about just how early and accurate those predictions really are. That's a crucial piece. Let's take the US Consumer Price Index, CPI. Big one. Measures inflation, consumer costs, everyone watches it.

Mhm. Key indicator.

For CPI, X-Tech actually provides multiple forecasts during the month. Their first forecast comes out on the third Monday of the current month. That's like more than three weeks ahead of the official CPI release. Significantly, this is often before most brokers have even published their predictions.

Wow. Okay. Three weeks early.

Yeah. And it uses historical data, sure, but also recent consumer survey data and their alternative data feeds. The metrics they report for the month-over-month percentage change show really high correlation and directional accuracy. Basically, they're getting the direction and the rough size of the change right most of the time. Then there's a second forecast. This one comes out around the third trading day of the following month. So still about a week ahead of the official number, and it improves on the first one because it folds in some additional government agency data that's just been released. Accuracy ticks up again. And they even have a third forecast just two trading days before the official CPI release, squeezing in even more last-minute info.
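To make that release schedule concrete, here's a minimal sketch in Python of the three forecast dates described above. It approximates trading days with weekdays (a real calendar would also exclude market holidays), and the April 10 official release date in the example is hypothetical.

```python
# Illustrative sketch only (not X-Tech's code): the three CPI forecast dates
# described above, approximating trading days with weekdays and ignoring
# market holidays. The April 10 official release date is hypothetical.
from datetime import date, timedelta

def third_monday(year: int, month: int) -> date:
    d = date(year, month, 1)
    d += timedelta(days=(7 - d.weekday()) % 7)  # advance to the first Monday
    return d + timedelta(weeks=2)               # two weeks later = third Monday

def nth_weekday_of_month(year: int, month: int, n: int) -> date:
    d = date(year, month, 1)
    count = 0
    while True:
        if d.weekday() < 5:  # Monday-Friday only
            count += 1
            if count == n:
                return d
        d += timedelta(days=1)

def weekdays_before(release: date, n: int) -> date:
    d, count = release, 0
    while count < n:
        d -= timedelta(days=1)
        if d.weekday() < 5:
            count += 1
    return d

# Example for a March reference month with an assumed April 10 CPI release
first = third_monday(2025, 3)                   # first forecast
second = nth_weekday_of_month(2025, 4, 3)       # ~third trading day of April
third = weekdays_before(date(2025, 4, 10), 2)   # two trading days before release
print(first, second, third)  # 2025-03-17 2025-04-03 2025-04-08
```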

So they're constantly refining it as more data flows in.

Exactly. And the sources, you know, they make a pretty bold claim here. They say X-Tech has produced the most accurate CPI forecast in the world 25 days before the release and weeks before most estimates are even available.

That's quite a statement.

It is. And it's not just CPI. Take the Conference Board Consumer Confidence Index, another important one. Their forecast for that comes out around the 15th of the current month. So roughly two weeks ahead of the official report. Again, strong correlation, good directional accuracy on predicting the index level itself.

Okay. And how does this stack up against, say, the professional economists, the consensus?

That's a really interesting comparison. Based on data going back several years, from late 2017 to early 2025, X-Tech's forecasts are available on average 12 days before the typical economist predictions hit the street.

12 days earlier.

And according to the materials, they were more accurate, too. They mention a CPI hit rate of 36%. That's the percentage of times their forecast matches the official month-over-month CPI down to three decimal places. And remember, that's with an average lead time of 24 days.
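As a quick illustration of how that hit-rate metric could be computed (the forecast and official values below are made up, not X-Tech's figures):

```python
# A toy computation of the hit-rate metric described above: the share of months
# where the forecast matches the official month-over-month CPI change when both
# are rounded to three decimal places. All numbers here are made up.
forecasts = [0.004, 0.003, 0.002, 0.005, 0.001]  # hypothetical MoM forecasts
officials = [0.004, 0.002, 0.002, 0.006, 0.001]  # hypothetical official prints

hits = sum(round(f, 3) == round(a, 3) for f, a in zip(forecasts, officials))
hit_rate = hits / len(forecasts)
print(f"hit rate: {hit_rate:.0%}")  # 3 of 5 matches -> 60% in this toy sample
```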

36% hit rate nearly a month in advance. That's not just marginal.

No, it suggests something potentially quite powerful. It shifts the needle from just reacting to economic news to maybe possibly proactively positioning for it.

Okay, that sounds incredibly powerful. But how do they actually do it?

How do they maintain accuracy, especially when, you know, unexpected things happen in the economy all the time?

The sources lay out this five-step process, using CPI as the example. First, they identify surrogate CPI features. Basically, figuring out which economic indicators actually influence the individual bits and pieces of the CPI. Things like, um, average hourly earnings, payroll numbers, unemployment, and even their own proprietary data like used car prices.

right? Finding the right ingredients.

Exactly. Second, they gather and prepare data inputs, collect all that relevant data, and then do something called feature engineering.

Mm. Think of it like meticulously prepping those ingredients, right? Not just raw data, but data that's optimized so the AI can make the best possible predictions.

Cleaning it, transforming it, making it useful for the models.
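To make "feature engineering" a bit more concrete, here's a minimal, generic sketch, not X-Tech's actual pipeline, of turning a raw monthly series into model-ready inputs; the series name and values are hypothetical.

```python
# A generic feature-engineering sketch (not X-Tech's actual pipeline): turning a
# raw monthly series into model-ready inputs such as month-over-month changes
# and lagged values. The series name and numbers are hypothetical.
import pandas as pd

raw = pd.Series(
    [271.0, 272.1, 273.0, 273.8, 274.5],
    index=pd.period_range("2024-01", periods=5, freq="M"),
    name="used_car_price_index",  # hypothetical proprietary input
)

features = pd.DataFrame({
    "level": raw,
    "mom_pct": raw.pct_change() * 100,  # month-over-month % change
    "lag_1": raw.shift(1),              # value one month earlier
    "lag_2": raw.shift(2),              # value two months earlier
}).dropna()

print(features)
```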

Precisely. Third, they forecast individual CPI components. So, they model and predict the level for each specific part of the CPI, food, energy, shelter, etc., using approaches tailored to each one. Fourth, evaluate and refine models. They're constantly checking the performance, seeing what worked, what didn't, and tweaking models to improve them with each iteration. And then finally, step five, they aggregate individual forecasts. They combine all those separate component forecasts to get the overall headline CPI number, that month-over-month percentage change.
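Here's a simplified sketch of that final aggregation step. Official headline CPI is actually built from component index levels and BLS relative-importance weights; this toy version just takes a weighted average of hypothetical component forecasts.

```python
# A simplified aggregation sketch: combining hypothetical component forecasts
# into a headline month-over-month figure via a weighted average. The component
# values and weights below are illustrative only, not official figures.
component_forecasts = {  # predicted MoM % change per component
    "food": 0.3,
    "energy": -1.2,
    "shelter": 0.4,
    "other": 0.2,
}
weights = {  # hypothetical relative-importance weights (sum to 1)
    "food": 0.14,
    "energy": 0.07,
    "shelter": 0.36,
    "other": 0.43,
}

headline = sum(component_forecasts[c] * weights[c] for c in component_forecasts)
print(f"headline CPI MoM forecast: {headline:.2f}%")  # 0.19% in this toy example
```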

Yeah. And what's really clever is the core modeling technique they use. It's called teacher forcing, or sometimes one-step-ahead forecasting.

Teacher forcing. Okay. What does that mean in practice?

So, uh, basically the model isn't just making a prediction and then using that prediction to make the next one down the line. Instead, it's constantly being corrected and improved by learning from the most recent actual data that comes in.

Ah, okay. So, it doesn't rely on its own potentially flawed past predictions.

Exactly. That avoids the problem you get with traditional multi-step-ahead forecasts, where an error early on can just snowball and get bigger and bigger over time. X-Tech is essentially ingesting this high-frequency, real-time data and applying these advanced statistical models, but always, always grounding the next step in the latest real data. The key thing is the data set is always updating. It's dynamic, and the accuracy generally increases as more real-time data flows in closer to the official release date.
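A toy example helps show the difference. The sketch below contrasts a recursive multi-step forecast, which feeds on its own predictions, with a one-step-ahead forecast that is always conditioned on the latest actual value; the AR(1)-style rule and the numbers are made up.

```python
# A toy contrast between recursive multi-step forecasting and one-step-ahead
# ("teacher forcing") forecasting, using a made-up AR(1)-style rule
# y_hat[t+1] = a * y[t]. The coefficient and the series are hypothetical.
a = 0.9
actuals = [1.00, 0.95, 0.97, 0.99, 0.96, 0.98]  # hypothetical observed series

# Recursive multi-step: after the first step, the model feeds on its own
# predictions, so an early error compounds over the horizon.
recursive = [actuals[0]]
for _ in range(len(actuals) - 1):
    recursive.append(a * recursive[-1])

# One-step-ahead: each prediction is conditioned on the most recent *actual*
# value, so errors get corrected as new data arrives.
one_step = [a * y for y in actuals[:-1]]

print("recursive:", [round(v, 3) for v in recursive[1:]])
print("one-step :", [round(v, 3) for v in one_step])
```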

Okay, that makes sense. So, they've got these cutting-edge forecasts, but you mentioned their insights go beyond just the predictions themselves. They offer historical context, too.

That's right. Something called point in time economic data.

Point in time. Got it. So, what exactly is that?

It lets you look back at historical economic indicators like CPI, GDP, jobs numbers and see them exactly as they were reported at any specific moment in the past. Crucially, it captures not just the initial release value but also all the subsequent revisions that happened over time.

Ah because those numbers often get revised later, right? Sometimes significantly.

Absolutely. And this data set has incredible depth. US CPI goes back to 1913. GDP figures start in 1947. It covers all the big ones. GDP growth, inflation, core inflation, non-farm payrolls, trade balances, interest rates, the works.

Okay. And why is seeing the unrevised as it was then data so important?

Well, think about it from the perspective of financial professionals or researchers doing backtesting or building models. Using currently available historical data, which includes all those later revisions, introduces something called revision bias or look-ahead bias. You're basically giving your historical model information that nobody actually had at the time.

right? You're cheating history in a way.

Sort of, yeah. This point in time data avoids that. It preserves the exact values that were publicly known on each specific date. So it accurately reflects the information that markets and policy makers were actually working with back then. It makes historical analysis much more robust and realistic.
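Here's a minimal sketch of what a point in time ("as-of") lookup might look like, assuming a simple table of data vintages; the values and dates are hypothetical.

```python
# A minimal point-in-time ("as-of") lookup, assuming a table of data vintages:
# (reference_period, release_date, value), sorted by release date. The CPI
# values and dates below are hypothetical.
from datetime import date

vintages = [
    ("2024-01", date(2024, 2, 13), 0.3),  # initial release
    ("2024-01", date(2024, 3, 12), 0.4),  # later revision
]

def as_of(ref_period: str, asof_date: date, table):
    # Return the latest value for ref_period that was public on asof_date.
    known = [v for p, d, v in table if p == ref_period and d <= asof_date]
    return known[-1] if known else None

print(as_of("2024-01", date(2024, 2, 20), vintages))  # 0.3 (before the revision)
print(as_of("2024-01", date(2024, 4, 1), vintages))   # 0.4 (after the revision)
```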

Okay, that's fascinating. So you get the forecast, you get the accurate historical picture. What about market expectations? The consensus view.

Good question, because understanding the consensus is also vital context, right? So, uh, X-Tech also provides access to US Reuters economic polls data. These polls are basically a comprehensive snapshot of what the market, or at least a broad pool of economists, analysts, and financial institutions, is expecting for key macro indicators. Reuters pulls them regularly.

So you can see the average forecast, the consensus number.

Yes, you get the average expected outcome, but importantly, you also get measures of forecast dispersion, like what's the highest prediction in the poll, what's the lowest. That gives you a sense of market uncertainty. Is everyone clustered around one number, or are opinions all over the map?

Ah, okay. That tells you how much conviction there is or isn't in the market.

Exactly. And these polls are updated regularly, capturing how expectations evolve as new information comes out. Plus, they track revisions made by individual economists over time. So, you get the forecast, the history, and the current market sentiment picture.
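As a small illustration, here's how the consensus and dispersion statistics described above could be computed from a hypothetical poll of economist forecasts:

```python
# Consensus and dispersion statistics for a hypothetical poll of MoM CPI
# forecasts (all numbers made up).
from statistics import mean, stdev

poll = [0.2, 0.3, 0.3, 0.4, 0.3, 0.5, 0.2]  # hypothetical forecasts, in %

consensus = mean(poll)
high, low = max(poll), min(poll)
dispersion = stdev(poll)  # one common way to measure disagreement

print(f"consensus {consensus:.2f}%, range {low:.1f}%-{high:.1f}%, stdev {dispersion:.2f}")
```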

Right. So, let's pull this all together. What does this whole suite of tools, the early forecasts, the point in time data, the consensus polls, what does it mean for you, the listener? Whether you're, you know, managing a portfolio, doing economic research, or maybe just trying to stay sharp on major economic shifts.

Well, the potential applications are pretty broad.

Yeah, the sources list quite a few. Like, uh, positioning portfolios: getting set before a big macro release drops and potentially moves the market. Enhancing trading strategies, obviously: folding these macro forecasts into directional bets or maybe relative value trades between assets. Arbitraging market mispricings, interesting one: exploiting situations where the consensus seems off, based on X-Tech's earlier forecast, before the actual number comes out.

Mhm. And also managing risk exposure. If you have a better sense of potential shifts in inflation or growth or consumer spending, you can adjust your portfolio allocations ahead of time.

Makes sense. Optimizing asset allocation more broadly too, right? Using these macro trends to guide decisions across stocks, bonds, commodities, currencies, even crypto. They mention improving market timing for getting in or out of positions, supporting fundamental research with better, more forward-looking macro models, and even anticipating surprises, maybe positioning yourself to provide liquidity if there's a market shock caused by an unexpected data point that X-Tech potentially flagged early.

Absolutely. And fundamentally, it all boils down to giving users what the source material calls the edge of foresight, not hindsight. It's about enabling proactive anticipation rather than just reactive adjustment, which, in today's world with so much information flying around and markets moving so fast, is essential.

It really feels like that. We've taken a pretty deep dive here into how X-Tech is trying to push the boundaries of economic forecasting, offering, uh, what seems like a genuinely unique perspective that really prioritizes getting those insights early and getting them right. So, as you think about the future of economic prediction and how you stay informed, maybe consider this shift.

Yeah. And perhaps here's a final thought to leave you with, building on all this. In an increasingly interconnected, rapidly changing global economy, how might this ability, the potential ability to gain weeks of foresight on key economic indicators, fundamentally transform things? Not just investment strategies, which we've talked a lot about, but maybe broader economic decision-making processes for governments, for businesses, maybe even for individuals, well beyond just the financial markets themselves. What could that look like?

 
 
 
