There’s one corner of the semiconductor sector that analysts are particularly bullish on as a way to play the artificial intelligence theme: DRAM, or dynamic random access memory, a type of semiconductor memory needed for data processing. As the use of AI grows, more and more memory is required; indeed, Morgan Stanley called memory the “foundational building blocks” of artificial intelligence.

“Memory is a cheaper way to play the start of AI’s explosive new decade-long growth phase,” Morgan Stanley said in a May 29 report, adding that investors are underestimating the impact of AI on the memory segment.

Analysts have also highlighted that there’s more upside for DRAM-related stocks due to production cuts. “Top DRAM producers have lowered their production by 20%-30%, and their capital expenditure plans by as much as 50%. Meanwhile, the demand outlook is starting to improve, as customer inventory levels have come down,” Henry Mallari-D’Auria, chief investment officer of emerging markets value at Ariel Investments, told CNBC Pro.

AI servers use four times as much DRAM as normal servers, said Mallari-D’Auria, adding that they are set to grow from a mid-single-digit share of total DRAM revenues to the mid-teens over the next few years.

Meanwhile, Citi said in a May 30 note that it was raising its outlook for DRAM average selling prices on the back of the production cuts and solid penetration growth in DDR5, a newer type of DRAM. “Our industry checks reconfirm that DDR5 demand, including HBM3 demand from Nvidia, has been rapidly increasing lately,” Citi said. “Considering additional DDR5 demand from Microsoft and Google, we expect DDR5 penetration will rapidly increase throughout the year.”

Nvidia and DRAM

Nvidia has been at the center of the AI buzz, and it recently reported much better-than-expected results and a huge forecast beat driven by strong AI demand.
The firm dominates the market for graphics processing units, or GPUs, which are used to speed up AI workloads. The American chipmaker is driving demand in the high bandwidth memory (HBM) segment of the DRAM market, according to analysts. HBM is a key component needed to run Nvidia’s AI processors such as the A100 and H100, something regular DRAM cannot do, Bank of America said in a June 1 report.

In fact, Nvidia’s competitors AMD, Intel and even some Chinese fabless firms are also promoting AI GPUs, suggesting increased new high bandwidth memory orders, the bank said. “We already observe rising competition (among NVIDIA competitors) to procure more HBM in Korea,” BofA analysts wrote. They added that the global high bandwidth memory market is set to grow from $2 billion in 2022 to $12 billion in 2027.

SK Hynix

South Korean chipmaker SK Hynix is a favorite among analysts in the DRAM space. Morgan Stanley named it a top pick and said it was a “key beneficiary of NVDA’s AI opportunity,” while both BofA and Citi are bullish on the stock. Morgan Stanley gave it a price target of 140,000 Korean won ($110), or about 17% upside, while BofA set a 160,000 won price target on the stock, or 34% upside.

Mallari-D’Auria, too, believes SK Hynix is well positioned to benefit from AI. “The Korean company is the largest pure-play memory producer in the world and has a particularly strong position in the high-end DRAM products which are used in AI servers,” he said.

Samsung Electronics

Another top pick for both Morgan Stanley and Citi is Samsung Electronics. Morgan Stanley gave it a price target of 90,000 Korean won, representing potential upside of 25%, and views Samsung as an indirect beneficiary of tighter supply driven by AI GPU and server demand. “We view our OW call as more defensible on costs, balance sheet strength and the ability to weather a severe downturn better,” Morgan Stanley said.
“Expectations are down and valuation is near historical lows, which sets the stock up nicely.”

Samsung and SK Hynix had a combined DRAM market share of around 70%-80% in 2020, according to BofA.

— CNBC’s Michael Bloom contributed to this report.