
With the emergence of smaller models and techniques like distillation, will AI scaling stop, eliminating the need for more compute power? Richard Ho, head of hardware at OpenAI, firmly believes that AI scaling will continue and that compute power will grow by orders of magnitude in the near future. During his keynote address at Synopsys SNUG, he also discussed OpenAI's in-house AI accelerator, challenges facing AI clusters, and AI-empowered EDA tools.
Read the full story at EDN’s sister publication, EE Times.
Related Content
- Key technologies push AI to the edge
- How AI Will Define the Next Silicon Supercycle
- Nvidia plus Arm: What Would It Mean for AI Compute?
- Optical PHYs facilitate 200G/lane speeds for AI clusters
- Custom AI Inference Has Platform Vendor Living on the Edge
The post Will compute continue scaling in AI clusters? appeared first on EDN.