Microsoft Steadily Ramps Up Generative AI Innovation And Monetization – Forbes


Microsoft (MSFT) is increasingly monetizing AI across its product portfolio and steadily expanding its investments to drive continuous innovation in this space.

Microsoft’s early AI success helped propel the stock last week to a new record high of $433.60. Recently trading around $415, Microsoft shares are up 10.4% YTD.

As more customers use Microsoft’s platforms and tools to build their own AI solutions, Azure is expanding its cloud market share. In fiscal Q3 (ended March), the company saw an acceleration in large, long-term Azure deals across its enterprise customer base. The number of Azure deals worth $100 million or more rose more than 80% year over year, while the number of deals worth $10 million or more more than doubled.

In FQ3, Microsoft Cloud revenue advanced 23% to $35.1 billion. Azure revenue growth of 31% surpassed guidance by 300 basis points and topped the consensus estimate of 28.6%. For FQ4 (June), Azure is expected to grow 30% to 31%. Following the FQ3 report, JP Morgan raised its Microsoft price target to $470 from $440, saying Azure growth could accelerate over the next 12 months.
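As an illustrative aside, the basis-point comparisons in these results are simple arithmetic: one percentage point equals 100 basis points. A minimal sketch (the helper name is hypothetical, not from Microsoft's reporting):

```python
def bps_delta(actual_pct: float, reference_pct: float) -> int:
    """Difference between two growth rates, expressed in basis points.

    1 percentage point = 100 basis points (bp).
    """
    return round((actual_pct - reference_pct) * 100)

# Azure's reported 31% growth vs. an implied ~28% guidance: a 300 bp beat.
print(bps_delta(31.0, 28.0))  # 300
# vs. the 28.6% consensus estimate: roughly a 240 bp beat.
print(bps_delta(31.0, 28.6))  # 240
```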

Larger Azure deals in the March quarter helped drive total commercial bookings growth to 31% in constant currency, a significant acceleration from 9% growth in FQ2. Commercial RPO of $235 billion gained 21% in constant currency, representing a sharp acceleration from growth of 16% in FQ2. Microsoft’s total revenue rose 17% to $61.9 billion, topping the consensus of $60.8 billion.

Morgan Stanley believes Microsoft has plenty of runway for growth in AI, as the innovation cycle is just getting started. In FQ3, Azure AI contributed about 700 basis points of growth, up from a boost of 600 basis points in the previous quarter. Morgan Stanley maintained its Microsoft price target of $520.

Goldman Sachs lifted its target by $65 to $515, saying the company offers a unique growth profile at scale, with the ability to expand both revenue and earnings by double digits in FY’25 (June). Microsoft is well-positioned to capture share of generative AI revenue via its broad suite of AI services and productivity-centric focus, says the firm. Goldman Sachs raised its Azure forecast to better reflect the confluence of factors that can sustain 25% growth through FY’25.

On the FQ3 earnings call, Microsoft CEO Satya Nadella said Azure has become “a port of call for pretty much anybody who is doing an AI project.” AI is bringing in new customers to Azure and powering expansions across the installed base. Nadella also pointed out that AI engagements don’t just sit on their own. While AI projects obviously start with calls to AI models, they also pull in adjacent services—such as vector databases, developer tools and Azure Search.

Microsoft is proving to be a pioneer in AI advancements. While large language models (LLMs) have been getting all of the attention, their hefty size means they can require significant computing resources to operate. Microsoft has introduced a new class of capable small language models (SLMs) that will make AI accessible to more people. These SLMs offer many of the same capabilities found in LLMs, but are trained on less data.

Microsoft recently announced the Phi-3 family of smaller models. These SLMs outperform models of the same size and the next size up across a variety of benchmarks that evaluate language, coding and math capabilities. SLMs are easier to use for organizations with limited resources. They’re designed to perform well on simpler tasks and can be fine-tuned to meet specific needs. Customers are able to select either large or small models that are best suited for their use cases.

LLMs are more suited for applications that need orchestration of complex tasks involving advanced reasoning, data analysis and understanding of context. For organizations looking to build applications that can run locally on a device (as opposed to the cloud) and where a task doesn’t require extensive reasoning or a quick response, an SLM will do. SLMs are also good for regulated industries and sectors that need high-quality results, but want to keep data on-premises.
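The selection criteria described above can be sketched as a simple routing rule. This is an illustrative decision helper only; the function and field names are hypothetical and not part of any Microsoft API:

```python
from dataclasses import dataclass


@dataclass
class TaskProfile:
    """Illustrative attributes that drive the model-size decision."""
    needs_complex_reasoning: bool     # multi-step orchestration, deep analysis
    must_run_on_device: bool          # local/edge execution, no cloud round-trip
    data_must_stay_on_premises: bool  # regulated-industry data-residency constraint


def choose_model_class(task: TaskProfile) -> str:
    """Route a workload to an SLM or LLM per the trade-offs in the article."""
    # On-device or data-residency constraints point to a small model,
    # provided the task doesn't demand extensive reasoning.
    if (task.must_run_on_device or task.data_must_stay_on_premises) \
            and not task.needs_complex_reasoning:
        return "SLM"
    # Complex, multi-step workloads favor a large model.
    if task.needs_complex_reasoning:
        return "LLM"
    # Simple tasks without special constraints can default to the cheaper SLM.
    return "SLM"


# A quick-response on-device task routes to an SLM; a complex
# data-analysis workload routes to an LLM.
print(choose_model_class(TaskProfile(False, True, False)))  # SLM
print(choose_model_class(TaskProfile(True, False, False)))  # LLM
```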

There’s a long-term opportunity to place more capable SLMs on smartphones and other devices that operate at the edge. This would mean AI-infused car computers, traffic systems, smart sensors on factory floors, remote cameras and devices that monitor environmental compliance. By keeping data within a device, users are able to minimize latency and maximize privacy.