As Chair and CEO of AMD, Dr. Lisa Su has spearheaded the company’s transformation into the industry’s high-performance and adaptive computing leader, helping solve the world’s most important challenges by delivering the next generation of computing and AI solutions. Before serving as Chair and CEO, she was the Chief Operating Officer responsible for integrating AMD’s business units, sales, global operations, and infrastructure enablement teams into a single market-facing organization responsible for all aspects of product strategy and execution.
Dr. Su joined AMD in January 2012 as Senior Vice President and General Manager of Global Business Units and was responsible for driving end-to-end business execution of AMD products and solutions.
In a fireside chat hosted by the Indian Institute of Science (IISc), Dr. Su shared her insights on driving high-performance computing, AI’s transformative potential, and building open, hardware-agnostic solutions for a sustainable, innovation-driven future.
Excerpts from the conversation Dr. Su had with Vinod Ganapathy, Associate Professor in the Department of Computer Science and Automation at IISc.
On the Inspiration to Join the Semiconductor Industry
What inspired me about semiconductors is that chips are the foundation of so many devices. They are the brains of computers and have the potential to change so much about the way we live and work.
When I joined AMD 12 years ago, the company was at a stage where we were looking for what’s next in computing. My interest was to drive high-performance computing and build the largest and most impactful semiconductors possible. That was my reason for joining AMD.
Redefining AMD’s Strategic Vision
Mark Papermaster, our CTO, joined AMD a few months before I did, and he and I became absolute partners in shaping the company’s strategy. We asked ourselves: “What do we want to be when we grow up?” I mean, as a company.
We aspired to be the leader in high-performance computing. To achieve that, we needed to focus on CPUs, GPUs, and the overall architecture required to make that vision a reality. Those were some of the key decisions we made early on.
On the Evolution of AI
AI has been around for a long time, but it was previously a domain of experts who understood how to use it. All that has changed in the last two years with the advent of generative AI and large language models. What was once a very specialized technology is now available for everyone to use.
When you can use natural language to interact with computing systems, it changes who can access and benefit from this technology. I believe we are at the very beginning of the global AI revolution. AI is the most impactful and high-potential technology I’ve seen in my career. It has the ability to make us more productive, make businesses more efficient, and help solve some of the world’s most significant challenges.
On Bridging the Gap with NVIDIA
Our approach to AI is rooted in software innovation. If you look at the evolution of AI, much of its software has been tied to specific hardware, which limits its accessibility.
We need higher-level frameworks that are hardware-agnostic. What’s changing is that nobody wants to program directly at the hardware level anymore. It takes too long, and there are only a few people in the world who can do it.
Our strategy is to create an open-source, hardware-agnostic software ecosystem that works across different platforms. It shouldn’t matter whether you’re using AMD, NVIDIA, or another hardware provider—the software should be seamless. That’s where we’re making significant investments.
On Specialized AI Models and Their Future
I do not subscribe to a one-size-fits-all approach to AI. While much of the conversation focuses on large GPUs and massive language models, we see opportunities in smaller, specialized models designed for specific use cases.
For example, we're focused on edge AI, which operates closer to where data is generated. We also believe in client AI, where individuals can run their models locally and securely on their devices.
Overall, we expect to see a wide range of devices and models tailored to specific applications, creating a diverse AI landscape.
Blending Sustainability and High-Performance Computing
If I look at high-performance computing, the biggest constraint isn’t performance—it’s power. You can achieve tremendous performance by deploying tens of thousands of CPUs, but the challenge lies in managing power consumption and cooling.
To solve this, we need to consider performance, power, and environmental constraints holistically. Public-private partnerships and collaborations with universities are critical to driving innovation. By working together, we can create sustainable systems that meet these challenges.
On the Role of PhDs
Honestly, I didn’t want to get a PhD. I was eager to finish my degree and start working, but my parents insisted, saying it was a once-in-a-lifetime opportunity. They were right.
The PhD process isn’t just about the specific subject matter; it’s about problem-solving. It’s about learning how to tackle difficult, unsolved problems. It teaches you discipline, perseverance, and the confidence to solve challenges that nobody else has addressed.
While a PhD may not be essential for everyone, the skills you gain from that process can be applied repeatedly throughout your career.