My first articles were about systems, execution, memory, fine-tuning, and reliability.
Those topics matter on their own.
But for me, they also point toward something larger:
robotics.
If I zoom out, I do not see agentic systems as the final product category. I see them as the path toward embodied systems that can operate in the physical world.
In other words:
the broader vision is not only software that reasons.
It is software, models, memory, training, and execution becoming able to act through machines.
That is why robotics matters so much to me.
Agentic Systems Are a Bridge
A lot of people talk about agents as if the destination were digital automation alone.
I do not think that is the full picture.
Agents matter because they force us to solve the right intermediate problems:
- planning
- memory
- reasoning
- specialization
- evaluation
- reliability
- adaptation
Those are exactly the capabilities you need before physical systems can become truly useful at scale.
A robot without these layers is mostly hardware.
A robot with them starts to become a system.
The Humanoid Question
When people think about robotics, they often jump directly to humanoids.
I understand why.
There is something deeply human in the idea. We are naturally drawn to the possibility of building physical cousins of ourselves, almost in the same way that we imagine extraterrestrial life: not just tools, but other embodied intelligences we can relate to.
That vision is powerful.
But I do not think it should blind us to what robotics already is today.
Most useful robots are not full humanoids.
They are:
- drones
- robotic arms
- wheeled machines
- small educational companions
- narrow industrial systems
That matters because the market will not wait for perfect humanoids to start transforming work.
It is already moving through smaller, more specialized bodies.
Embodiment Matters More Than Shape
For me, the real question is not "Does it look human?"
The more important question is: can it act usefully in the physical world?
That can happen through many forms.
A Reachy Mini used by children or teachers.
A safety assistant helping construction teams reduce risk.
A robotic arm moving parts or materials.
A mobile unit inspecting, transporting, or monitoring.
These systems do not all need the same body.
What they need is useful embodiment.
The physical form should follow the task, not the mythology.
Why Robotics Will Drive Products and Services
I think the coming decade will push robotics much deeper into real products and services.
Not only because hardware keeps improving, but because the software layer around it is finally catching up.
What changes the equation is the convergence of several forces:
- better AI models
- better training pipelines
- better simulation and evaluation
- cheaper hardware
- more open source building blocks
- stronger developer ecosystems
This is why robotics now feels less like a distant moonshot and more like an emerging product category.
The stack is becoming composable.
And once the stack becomes composable, experimentation accelerates.
Open Source Will Matter More Than People Think
One of the most important accelerants in this space will be open source.
Not because open source solves everything, but because it lowers the cost of iteration and distribution.
When models, training recipes, control systems, hardware interfaces, and simulation tools become easier to access, more people can build.
That matters enormously.
A robotics ecosystem does not grow only from large labs.
It also grows from researchers, startups, schools, hackerspaces, developers, and niche product builders who keep shipping smaller experiments until the category matures.
That is often how real technological markets are formed.
Near-Term Use: Humans Become Creative Again
There is also a more immediate reason why I care about these first three articles and the systems behind them.
In the near term, the purpose is not to replace people with machines.
The purpose is to make humans more capable again.
Better execution systems, better memory, better reasoning, and better specialization can help people recover time, focus, and creative bandwidth.
Humans still do the matching.
Humans still define the direction.
Humans still decide what is meaningful.
But AI can remove a large amount of friction between imagination and execution.
That is already valuable, even before robotics reaches maturity.
From Imagination to Physical Service
What excites me most is the continuity between these layers.
First, AI helps humans think, organize, design, and execute better.
Then those same systems become capable of guiding physical tools.
Eventually, you get products and services that are not only digital, but embodied.
That could mean robots that:
- demonstrate concepts in classrooms
- assist teachers and children in learning environments
- improve safety in construction and industrial contexts
- save time on repetitive physical handling
- help people supervise or coordinate real-world operations
That is where imagination becomes service.
Not because the machine replaces human purpose, but because it extends human reach.
The Real Opportunity
To me, robotics is not a side branch of AI.
It is one of the most concrete destinations of the whole movement.
If intelligence remains trapped inside screens, it will still be useful.
But once intelligence can reliably connect to tools, bodies, sensors, and physical environments, the economic and social impact becomes much larger.
That is when product categories multiply.
That is when services become embodied.
That is when AI starts to reshape daily life in visible ways.
Final Thought
The first layers are about making systems able to think, know, decide, and improve.
Robotics is what happens when those layers gain a body.
Humanoids may become one part of that future.
But the deeper shift is broader than humanoids.
It is the emergence of useful embodied systems across education, industry, safety, logistics, assistance, and everyday work.
That is why I see robotics not as a separate topic, but as the continuation of the same architecture.
First the system learns to reason.
Then it learns to act.
Then it enters the physical world.