How to Make Sure Your Next Computer Is AI Ready

Artificial Intelligence is already changing the way many of us interact with the internet, not to mention our computers. With AI tools like Copilot and ChatGPT expected to become more deeply integrated into Windows and macOS machines, buying a computer is about to get much more interesting.

The rapid advancements in AI technology may feel dizzying, but one lesson is already clear: if you're buying new computer hardware today, you need to think about how it will run AI tomorrow, especially if you plan to run AI locally rather than relying on cloud-based versions.

It might seem easy just to buy the best computer you can get right now. Sure, you'll probably end up with specs that broadly work for what you want. But if you're hoping to future-proof as much as you can for AI's needs, it's important to consider the features that will help you most: machine learning chip technology, RAM, and storage space.

Machine Learning and AI chips

The easiest feature to look for in any new purchase is Machine Learning and AI-specific hardware. 

AMD, NVIDIA and Apple have all begun putting additional AI features into their chips, be they graphics cards or full silicon packages. We’re still learning what capabilities these extra features offer for everyday people, but we do know that right now they are already being used to speed up editing for video, photo, and audio. 

Intel has been a frontrunner in integrating AI into its latest Core Ultra processors, which are specifically designed to accelerate AI workloads, especially if you're running AI tools directly on your local hardware rather than in the cloud. AMD has followed suit, touting processors with enhanced AI capabilities such as the Ryzen 7 8700G, and NVIDIA continues to impress with its AI-centric advancements, both in hardware and software.

One key way we're likely to benefit from these AI-capable chips is through on-device assistants like Siri and Google Assistant. They're not the same as Google's Gemini or ChatGPT, which can answer questions requiring complex reasoning, like "How many M&M's can I fit into the trunk of a Hyundai Sonata?" Instead, they can speed up tasks like calling a friend, setting timers, creating to-do items, or even organizing your calendar.

AI is also increasingly being used to help process images. Many photo apps can already identify people, pets, and places across your photos. Apple and Google both offer search bars you can type "beach" into and get back images of the beach. These types of features will likely get better as on-device AI technology advances.

It’s all about the memory

Beyond the ML-specific chips built by Apple, AMD, Intel, and NVIDIA, the next thing to look out for is memory.

On-device AI technology requires as much memory, both RAM and storage space, as you can give it, and that's not an exaggeration. One of the best models you can run on a computer today, Llama 2 from Facebook owner Meta, can use as much as 37GB of RAM. Yes, that's RAM.
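Why do these models eat so much RAM? A common back-of-envelope estimate (an illustration, not an official formula) is parameter count times bytes per parameter, plus some overhead for activations and the runtime. At 4-bit quantization (half a byte per parameter), a 70-billion-parameter model like the largest Llama 2 lands in the same ballpark as the figure above:

```python
def estimate_model_ram_gb(params_billions: float, bytes_per_param: float,
                          overhead_factor: float = 1.1) -> float:
    """Rough estimate of RAM (in GB) needed to hold a model's weights,
    with a ~10% fudge factor for activations and runtime overhead."""
    bytes_total = params_billions * 1e9 * bytes_per_param
    return bytes_total * overhead_factor / 1e9

# 70B parameters, 4-bit quantization (0.5 bytes per parameter):
print(round(estimate_model_ram_gb(70, 0.5), 1))  # → 38.5

# The same model at 16-bit precision needs roughly 4x as much:
print(round(estimate_model_ram_gb(70, 2.0), 1))  # → 154.0
```

The overhead factor is an assumption here; real memory use varies with context length and the inference software you run.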

Meanwhile, NVIDIA’s new Chat with RTX, available as a free download for PCs with NVIDIA 30-series and 40-series GPUs, needs around 36GB of storage space just to download the installer. 

While that may seem like an eye-popping amount of memory today, don't worry too much yet. For now, on-device AI will more likely be kept to simpler tasks, which require a fraction of that memory to run, while the more advanced tasks are left to web-based AI services like OpenAI's ChatGPT, Google's Gemini, Microsoft's Copilot, and more.

It's hard to say exactly how much memory we'll all need, but it's a safe bet that if you plan to buy a computer that lasts more than a couple of years, you'll want at least 16GB of memory, and more likely 32GB.
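If you want to check how much RAM an existing machine has before deciding whether it's up to the task, Linux reports it in `/proc/meminfo`. A minimal sketch (the sample text below is made up for illustration):

```python
def parse_meminfo_total_gb(meminfo_text: str) -> float:
    """Parse the MemTotal line from Linux /proc/meminfo (value is in kB)."""
    for line in meminfo_text.splitlines():
        if line.startswith("MemTotal:"):
            kilobytes = int(line.split()[1])
            return kilobytes / (1024 ** 2)  # kB -> GiB
    raise ValueError("MemTotal not found")

# On a real Linux machine you would feed it the actual file:
# with open("/proc/meminfo") as f:
#     print(parse_meminfo_total_gb(f.read()))

sample = "MemTotal:       32795412 kB\nMemFree:        10240000 kB"
print(round(parse_meminfo_total_gb(sample), 1))  # → 31.3
```

Note that a "32GB" machine reports slightly less than 32 GiB usable, since firmware and integrated graphics often reserve a share.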

ARM vs x86

One of the more interesting things we've learned about AI technology so far is how GPU-dependent it all is. And oddly enough, for now ARM-based chips tend to have tighter integration between their CPUs, GPUs, and memory.

One key example is Apple, whose M-series chips are competitive with similar devices planned by Intel, AMD, and mobile chip maker Qualcomm.

The good news for consumers is that competition is likely to drive down prices and drive up innovation. Microsoft has already said it plans to bake AI into many aspects of its upcoming Windows updates, and Apple is rumored not to be far behind. New AI-forward Windows systems based on Qualcomm ARM chips are also expected sometime in 2024.
