xAI completed its upsized Series E funding round, exceeding the $15 billion targeted round size, and raised $20 billion. Investors participating in the round include Valor Equity Partners, Stepstone Group, Fidelity Management & Research Company, Qatar Investment Authority, MGX and Baron Capital Group, amongst other key partners. Strategic investors in the round include NVIDIA and Cisco Investments, who continue to support xAI in rapidly scaling our compute infrastructure and buildout of the largest GPU clusters in the world.
This new development comes from a Bloomberg report detailing xAI’s widening losses and massive cash burn. According to the report, xAI executives explicitly told investors that their goal is to “develop self-sufficient AI to power robots like Tesla’s Optimus.”
This is a massive shift in the narrative and potentially the smoking gun in ongoing shareholder lawsuits regarding Musk’s breach of fiduciary duty.
The Bloomberg article referenced by Electrek reports the following:
- xAI reported a net loss of $1.46 billion in Q3 '25, compared to a net loss of $1 billion in Q1 of that year.
- xAI spent $7.8 billion total from Q1 through Q3 '25 and was running out of cash.
- xAI's revenue was just $107 million in Q3 '25, and about $59 million in Q2 '25.
- Some of xAI's CapEx projects are funded by debt, and xAI worked with VC firms to form a Special Purpose Vehicle company to raise money through issuing debt.
These finances look extremely ugly: billions in investment that are barely generating any income.
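To put a number on "barely generating any income," here's a quick back-of-the-envelope using only the figures reported above (illustrative arithmetic, nothing more):

```python
# Back-of-the-envelope on the xAI figures reported above (all USD).
q3_revenue = 107e6    # Q3 '25 revenue
q3_net_loss = 1.46e9  # Q3 '25 net loss
total_spend = 7.8e9   # Q1-Q3 '25 total spend

loss_to_revenue = q3_net_loss / q3_revenue
print(f"Q3 '25: ~${q3_net_loss / 1e9:.2f}B lost on ${q3_revenue / 1e6:.0f}M "
      f"of revenue, i.e. ~{loss_to_revenue:.1f}x revenue burned in one quarter")
```

So in Q3 alone the company lost roughly 13-14 dollars for every dollar it earned, which is what makes the comparison to the dotcom era feel apt.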
If this reckless level of spending (on AI in the tech industry generally) doesn't result in anything useful, I believe the collapse is likely to be as spectacular as the Dotcom bust.
I thought Tesla was the AI company ??
FWIW, the AI Tesla and xAI are pursuing are pretty different. Tesla is doing narrow self driving + venturing into world model cause/effect stuff, which one camp of AI folks thinks is the next big wave after LLMs; xAI is more in the LLM space, which the other camp believes will achieve AGI. Idk where they're going long term. xAI doesn't do any real-world AI; in comparison, Tesla only does real-world AI... It'd be weird for Tesla to be shipping code assist or image generation, for example.
I actually really like the structure - I don't find any of Musk's companies to be competing with each other; they're all quite complementary in their efforts and hyper-focused. I'm glad Tesla doesn't have its own LLM; it'd be Meta-quality at best and a distraction from Optimus and FSD.
Lately I'm a bit worried that Tesla's FSD tech stack is going to end up out of date. They invented everything themselves out of necessity, but now there are going to be platforms for this kind of stuff. Still, if Tesla gets there first it's going to be huge.
Getting there first isn't nearly as valuable in cars as it is in traditional tech. Cars are very slow to be replaced, and there's no network effect. It's not like having a killer app means every car on the road is yours; it helps you for that 1-2 years of sales, but that's it.
You are mistakenly treating robotaxis as replacements for cars. An AV isn't a replacement; it's a different thing from a regular car.
Since you said "get there first" I assumed you meant available for individuals to purchase, because they're not close to first on the robotaxi train. Waymo is obviously the most well known, but May Mobility and now even Zoox operate in the US, and WeRide, May Mobility, Baidu, Pony.ai, and AutoX operate in other markets. Tesla can be 7th globally, though! But it has to beat AVRide to the punch on that. Otherwise it might be relegated to 8th or 9th.
> Tesla is doing narrow self driving + venturing into world model cause/effect stuff [...] xAI is more in the LLM space

These two things are really sort of the same. For instance, VLAs are just multi-modal LLMs. I wouldn't really characterize them as different disciplines. There are implementation differences, but they aren't different technologies.
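To make the "VLAs are just multi-modal LLMs" claim concrete, here's a toy sketch. Every name is made up and the "model" is a stub, not a real transformer; the point is only that text tokens, image-patch tokens, and discretized action tokens can share one vocabulary and one autoregressive loop:

```python
# Toy sketch: a VLA viewed as an LLM over a unified token vocabulary.
TEXT_TOKENS = ["pick", "up", "the", "cup"]
IMAGE_TOKENS = [f"img_patch_{i}" for i in range(4)]  # stand-in for vision embeddings
ACTION_TOKENS = ["move_arm", "close_gripper", "lift"]  # discretized actions

VOCAB = TEXT_TOKENS + IMAGE_TOKENS + ACTION_TOKENS

def next_token(context):
    # Stub policy: a real VLA runs a transformer here. The interface is
    # identical whether the emitted token is a word or a robot action.
    return VOCAB[sum(len(t) for t in context) % len(VOCAB)]

# One autoregressive loop, regardless of modality:
context = IMAGE_TOKENS + TEXT_TOKENS  # camera frame + language instruction
for _ in range(3):
    context.append(next_token(context))

print(context[-3:])
```

The "implementation differences" the comment mentions live inside `next_token` (tokenizers, vision encoders, action heads); the surrounding machinery is the same next-token prediction either way.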
To wit:

> xAI doesn't do any real-world AI

Take note how Gemini Robotics is a Gemini-based VLA doing CoT. Same same, but different. As we move into foundation models the phenomenon will continue. There's no way around this: xAI is quickly going to end up in a conflict-of-interest predicament, and I honestly have no idea how this gets solved. They will end up overlapping, and therefore competing.
I guess we'll have to see how things play out in practice. I find it unlikely xAI moves into realtime robotic sensing and manipulation, for example; I'm not sure where that'd fit into their business model or brand. There's certainly technological overlap to some degree, but the applications and inference HW seem likely to diverge further with time.
I get what you're saying, that there's some overlap in that ultimately the systems boil down to seeing and thinking; I just suspect the applications are different enough to warrant separation... For example, Tesla is still going to be developing first-party inference HW and co-developing it alongside their training stack and specialized models.
There's a massive opportunity for xAI that doesn't compete with Tesla, and the money in their space probably isn't in building autonomous robotics - unless you think there's a real moat they could build there?
With all due respect, where is your overwhelming confidence coming from?
You use the term LLM when I believe you actually mean neural nets w/ deep learning. LLMs are an application-specific, token-based neural net, and are primarily an unsupervised-learning application.
Tesla’s FSD (learning the roads) is a fundamentally SUPERVISED learning application.
They have completely different data sets, and each model is a mirror of its dataset (and its labeling).
Yeah, but sorry, no convergence anytime soon - from my understanding.
I'm a software engineer with ML experience.
I'm using the term LLM to refer to Large Language Models.
These are all gibberish sentences. Training is not supervised or unsupervised.
Idk if you're just being super pedantic or if you have no idea what you're talking about, but supervised vs. unsupervised training definitely exists. It's usually referred to as supervised or unsupervised learning, but it refers to whether a labeled dataset is used to train and test a model...
Driving datasets aren't inherently labelled. Unlabelled training is common at many (most, actually) layers of the stack.
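For what it's worth, the usual term for LLM pretraining is self-supervised: the "labels" are manufactured from the raw data itself rather than supplied by human annotators. A minimal illustration with toy data (the filenames and label strings below are invented):

```python
# Self-supervised next-token pairs derived from unlabeled text:
# no human labeling step; the "label" is simply the following token.
text = "the car turned left at the light".split()
pairs = [(text[:i], text[i]) for i in range(1, len(text))]
print(pairs[0])  # (['the'], 'car')

# Contrast with classic supervised learning, where a human supplies labels:
labeled = [("frame_001.jpg", "stop_sign"), ("frame_002.jpg", "pedestrian")]
```

That distinction is why both sides above are half right: the training objective uses input/target pairs like supervised learning, but no labeled dataset in the human-annotation sense is required.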
https://en.wikipedia.org/wiki/Dunning–Kruger_effect
Ironically nailed it — you don't know what you're talking about.
I think it's mostly fine? Like if there were a company which specializes in making semi trucks and one which only makes cars. They both have very similar factories and talent needed but they can still work together without conflict of interest.
Tesla already lost engineers to xAI but they were already supposed to leave regardless. Allegedly
Pretty hard to know if someone might change jobs, and it's really difficult to stop them. 5 years is a long time to work at one company in the Bay Area; the average is 2 to 3 years for tech workers.
xAI doesn't have the expertise or experience in designing and mass-manufacturing physical products like Tesla does. They won't build a team from the ground up to build robots, and they won't develop their own AI training and inference chips either. Their target is replacing dinosaur software companies like Microsoft that make one-size-fits-all software: with MacroHard, companies and individuals could create applications tailored to their specific needs on demand using AI.
Tesla is complementary, starting with putting Grok in their vehicles, and will use Grok in Optimus bots to interact with users. They will also supply chips to xAI that will eventually be sent to space by SpaceX in 5 years to build millions of AI data centers, which can be used by xAI to run Grok.
They are not the same. Running your model on an absurdly expensive supercomputer the size of a football field is incomparable to running it on a $10K unit that fits in the dash. That factor alone necessitates dramatically different approaches to how you design the model.
Not even close to informed commentary. Your average 30B model can run on a well-specced personal computer; there's an entire subreddit of people doing so at r/LocalLLaMA. The supercomputers are for training, not inference, and the process of training is fundamentally the same at 3B, 30B, and 300B.
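On the "30B on a personal computer" point, the binding constraint is mostly memory for the weights. A rough sizing sketch (it ignores KV cache, activations, and runtime overhead, so real requirements are somewhat higher):

```python
def weight_memory_gb(n_params: float, bits_per_weight: int) -> float:
    """Rough memory needed just to hold the weights, in GB (10^9 bytes)."""
    return n_params * bits_per_weight / 8 / 1e9

# A 30B-parameter model at common precisions/quantizations:
for bits in (16, 8, 4):
    print(f"{bits}-bit: ~{weight_memory_gb(30e9, bits):.0f} GB")
```

At 4-bit quantization that's roughly 15 GB of weights, which is why a single high-end consumer GPU or a machine with plenty of RAM can serve such a model.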
> the process of training is fundamentally the same at 3B, 30B, and 300B.

I think this is a highly dubious claim unless we're talking in really, really broad strokes. E.g. with smaller models you have a large reliance on distillation, while with larger models you have MoE and are likely leaning heavier on nicher kinds of distributed training, working at a broader scope. Smaller vs. larger models also have different optimization targets: latency vs. memory bandwidth vs. cost efficiency vs. space efficiency... RAG + tool use.
Granted, I don't work in ML (aside from hobbyist stuff in the past plus uni coursework that's outdated by now), but I have dealt a lot with embedded, mobile, and distributed systems. I have a hard time believing there's such massive overlap between those two worlds in any domain, because the tradeoffs are always immense; it's like saying graphics on CPU and GPU are basically the same because both have lighting models, textures, meshes, render targets...
With 300ms response times?
I would think xAI would do Optimus too, if Tesla ever lacks compute power to do it themselves and/or wants to keep that compute solely for FSD.
I also thought that massive Elon pay package was supposed to be so that he would keep his focus on Tesla and not take all the AI work over to xAI. Weird.
Tesla is the world model company
When profits?
https://electrek.co/2026/01/09/elon-musk-xai-build-ai-tesla-optimus-amid-breach-of-fiduciary-duty-lawsuit/
Love it