Introduction

It was a quick one this week. I watched the latest Azure update from John Savill, and while it’s a shorter episode, there are a few things worth mentioning. I’m especially interested in the improvements to Cosmos DB mirroring and the new GPT-5.2-Codex model.

Before we start: John mentioned that his role at Microsoft has changed. He is now the Chief Technology Officer of America’s Markets and Industries. Well done to him. Even better, he confirmed it won’t affect the YouTube channel, which is good news, because his weekly updates are genuinely useful for keeping up to date.

AKS: Ubuntu 24.04 Now GA

Ubuntu 24.04 is now generally available for AKS. This applies to Kubernetes 1.32 and above.

Key points:

  • It will become the default on Kubernetes 1.35
  • Uses containerd 2.0
  • All the usual benefits of a newer Ubuntu LTS

If you run AKS clusters, this is mostly one to be aware of. You don’t need to do anything right now unless you want to opt in early, but be ready for the change when 1.35 comes around.
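
If you do want to opt in early, adding a node pool on the new image would look roughly like this with the Python management SDK. Treat the "Ubuntu2404" os_sku value and the placeholder names as my assumptions rather than anything confirmed in the video; check the AKS docs for the exact opt-in mechanism.

  # Sketch: add an AKS node pool that opts in to the newer Ubuntu image early.
  # Assumes azure-identity and azure-mgmt-containerservice are installed, and
  # that "Ubuntu2404" is the os_sku value for the new image (unverified).
  from azure.identity import DefaultAzureCredential
  from azure.mgmt.containerservice import ContainerServiceClient
  from azure.mgmt.containerservice.models import AgentPool

  client = ContainerServiceClient(DefaultAzureCredential(), "<subscription-id>")

  pool = AgentPool(
      count=3,
      vm_size="Standard_D4s_v5",
      mode="User",
      os_type="Linux",
      os_sku="Ubuntu2404",           # assumed opt-in value for Ubuntu 24.04
      orchestrator_version="1.32",   # needs Kubernetes 1.32 or above
  )

  client.agent_pools.begin_create_or_update(
      "<resource-group>", "<cluster-name>", "ubuntu2404pool", pool
  ).result()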

Cosmos DB: Fabric Mirroring with Private Networking

This one is genuinely useful for businesses.

Here’s the scenario: you have a Cosmos DB account locked down with private endpoints or VNet restrictions, and you want to mirror that data into Fabric’s OneLake so it can be analysed. Until now, those network restrictions blocked mirroring.

You can now use a network ACL bypass feature to enable Microsoft Fabric to access Cosmos DB for mirroring.

Just be aware that if you’re using private endpoints only, you’ll need to temporarily allow public access while you set up the mirroring. Once it’s configured, you can disable public access on the account again.
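
For what it’s worth, the account-level knob is exposed in the Cosmos DB management plane today, so the whole dance could be scripted along these lines with the Python SDK. This is a sketch under my own assumptions (the "AzureServices" bypass value, the placeholder names, and that nothing extra is needed on the Fabric side), not the confirmed setup flow.

  # Sketch: let trusted Azure services bypass the account's network ACLs so
  # Fabric mirroring can reach it, and temporarily allow public access while
  # the mirror is being set up. Values and names are assumptions; verify
  # against the current Cosmos DB / Fabric mirroring documentation.
  from azure.identity import DefaultAzureCredential
  from azure.mgmt.cosmosdb import CosmosDBManagementClient
  from azure.mgmt.cosmosdb.models import DatabaseAccountUpdateParameters

  client = CosmosDBManagementClient(DefaultAzureCredential(), "<subscription-id>")

  client.database_accounts.begin_update(
      "<resource-group>",
      "<cosmos-account>",
      DatabaseAccountUpdateParameters(
          network_acl_bypass="AzureServices",  # assumed bypass value for trusted services
          public_network_access="Enabled",     # temporary, only while mirroring is configured
      ),
  ).result()

  # Once mirroring is up and running, lock public access back down:
  client.database_accounts.begin_update(
      "<resource-group>",
      "<cosmos-account>",
      DatabaseAccountUpdateParameters(public_network_access="Disabled"),
  ).result()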

It’s not perfect, but it solves a big problem for organisations that want both tight network control and the benefits of Fabric integration.

GPT-5.2-Codex: The New Coding Model

This is the interesting one. GPT-5.2-Codex is now available in both Azure AI Foundry and GitHub Copilot.

The numbers:

  • 400,000 token context window (roughly 100,000 lines of code)
  • 50+ programming languages supported
  • Multimodal: accepts code, natural language, and images

The context window is the headline here. You can give it an entire codebase, UI mockups, and the design documents that explain them, and it can hold all of that in context at once. That makes it particularly useful for:

  • Large-scale refactoring projects
  • Migration work
  • Iterative development across complex systems

I haven’t tested it much yet, but I’m most interested in the multimodal side. If we can hand it a UI mockup image and have it generate the matching code, it could change how we prototype.
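
To make that concrete, a minimal sketch of that prototyping loop against an Azure OpenAI deployment might look like this. The deployment name "gpt-5.2-codex" and the api_version are my assumptions; use whatever your Foundry resource actually exposes.

  # Sketch: send a UI mockup plus an instruction to a GPT-5.2-Codex deployment
  # and ask for component code. The deployment name and api_version are
  # assumptions; substitute whatever your Foundry resource actually uses.
  import base64
  import os

  from openai import AzureOpenAI

  client = AzureOpenAI(
      azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
      api_key=os.environ["AZURE_OPENAI_API_KEY"],
      api_version="2024-10-21",  # assumed; use a version your resource supports
  )

  with open("mockup.png", "rb") as f:
      mockup_b64 = base64.b64encode(f.read()).decode()

  response = client.chat.completions.create(
      model="gpt-5.2-codex",  # your deployment name
      messages=[
          {
              "role": "user",
              "content": [
                  {"type": "text",
                   "text": "Generate a React component that matches this mockup."},
                  {"type": "image_url",
                   "image_url": {"url": f"data:image/png;base64,{mockup_b64}"}},
              ],
          }
      ],
  )

  print(response.choices[0].message.content)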

OptiMind SLM: Specialized Optimization Model

Microsoft Research has released OptiMind, a small language model that is designed to solve optimisation problems.

Use cases:

  • Workforce scheduling
  • Supply chain design
  • Network deployment planning
  • Financial portfolio optimization

These are all problems where you need to find the maximum or minimum of an objective function, subject to constraints. The model identifies the problem class, pulls in hints based on that class, and generates solutions with optional self-correction.
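
To make the problem class concrete, here’s the kind of thing OptiMind is aimed at: a toy workforce-scheduling problem that you’d normally have to formulate by hand as a linear program. The numbers are made up and the code below uses scipy purely for illustration; the pitch is that you describe the problem in plain language and the model handles the formulation.

  # A toy workforce-scheduling LP, formulated by hand with scipy for illustration.
  # Three overlapping shifts; minimise staffing cost while covering the demand in
  # each period. This is the kind of formulation OptiMind aims to automate.
  from scipy.optimize import linprog

  # Cost per worker on the morning, afternoon and evening shifts.
  cost = [100, 90, 110]

  # Coverage constraints (>=), flipped to <= for linprog by negating both sides:
  #   morning:   x1           >= 4
  #   afternoon: x1 + x2      >= 6
  #   evening:        x2 + x3 >= 5
  #   night:               x3 >= 3
  A_ub = [[-1, 0, 0], [-1, -1, 0], [0, -1, -1], [0, 0, -1]]
  b_ub = [-4, -6, -5, -3]

  result = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 3, method="highs")
  print(result.x, result.fun)  # optimal staffing per shift and total cost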

The pitch is that a specialised model can beat a general-purpose LLM on its particular problem class. If you’re solving optimisation problems with a general LLM today, this might be a better fit.

You can try it in Azure AI Foundry now.

Final Thoughts

There weren’t many announcements this week, but the Cosmos DB mirroring improvement and GPT-5.2-Codex are both worth keeping an eye on. The trend towards specialised models like OptiMind is interesting too. We’re moving past the idea that one model has to do everything and towards tools designed for specific problems.

That’s how it should be.


Sources

  1. John Savill, “Azure Weekly Update - 16th January 2026,” YouTube, https://www.youtube.com/watch?v=0U9CjXk5o2E