The News: Qualcomm and Meta recently announced that they are working together to optimize the execution of Meta’s Llama 2 language models directly on-device, without relying solely on cloud services. Read the Qualcomm Press Release for more information.
Qualcomm Works with Meta on On-Device AI: On-Device AI Without a Cloud in the Sky
Analyst Take: Without a doubt, AI usage will be hybrid, traversing between on-device execution and the cloud, and the partnership between Qualcomm and Meta is another indicator of that momentum. A few catalysts for the growth of on-device AI include situations where the user lacks Wi-Fi or cellular connectivity, or is prohibited from sharing sensitive information with popular services such as ChatGPT. In fact, drawn by the allure of increased productivity, users continue to unintentionally share sensitive information, which is forcing organizations to provide further training on what counts as privileged or sensitive information as part of their risk mitigation efforts.
On-Device AI Example: Your AI Assistant, Putting the Odds in Your Favor
My viewpoint is that we are still in the early innings of how on-device AI will improve our lives, and a key example is a virtual assistant that learns about you and aggregates the things you need. Imagine it is a Thursday evening and you decide to take an impromptu camping trip to decompress. You speak into your smartphone and tell your virtual assistant you want to go camping in the desert. Your device already knows a lot about you: hobbies such as music, botany, astronomy, and geology, along with the medications you take, medical diagnostics from other devices, allergies, and other personal information.
In a nutshell, the image below shows a simplified chain reaction of things your assistant “may” do for you. It may find AI-based applications that align with your hobbies for a richer experience while loading and recommending other applications that help keep you safe. For example, when you finally arrive at your campsite, you might scan the area around you with your phone to get insights about insects, reptiles, animals, and other things that might be creeping around, along with advice on how to avoid or safely interact with them.
On-Device AI Applications You May Use on a Camping Trip
Here are just a few potential on-device applications:
- Insect AI Identification: Insect identification allows the user to take a video or picture of an insect, and the application quickly identifies it and provides background information. For example, you might find a scorpion in the desert and take a picture of it. The application will tell you what species it is, whether it is venomous, how to set up your campsite to avoid scorpions, and what to do if you are accidentally stung by one. (A minimal sketch of how this kind of on-device image classification works follows this list.)
- Geology AI Identification: The AI-based geology or rock identifier app allows the user to take a picture of a rock or terrain, which could be useful for hobbyists looking for formations such as geodes in volcanic rock, for example. The application may also let you take a picture of the landscape to highlight terrain patterns where floods may occur.
- Astronomy AI Identification: The AI-based stargazing application helps the user point their telescope at a constellation, planet, comet, or star. The device can also tell you when certain satellites will pass overhead.
- Dermatology AI Identification: The AI-based dermatology application allows the user to take a picture of their own skin or a family member’s, which may help in a variety of situations. For example, it may help determine whether an insect bite has become infected and is developing into cellulitis. The application may also alert you that you have an extreme sunburn and link to the tree and plant app to help you find an aloe vera plant to treat it.
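To ground the insect identification example above, here is a minimal sketch of what on-device image classification typically looks like, assuming a small pre-trained vision model (MobileNetV3 as a stand-in for a purpose-built insect classifier); the image filename and the generic class labels are illustrative assumptions, not any vendor’s actual implementation.

```python
# Minimal sketch: on-device image classification for a hypothetical "insect ID" app.
# MobileNetV3 is used as a stand-in for a purpose-built insect classifier;
# the photo path is a placeholder.
import torch
from torchvision import models, transforms
from PIL import Image

# Load a compact model suitable for mobile-class hardware.
model = models.mobilenet_v3_small(weights=models.MobileNet_V3_Small_Weights.DEFAULT)
model.eval()

# Standard ImageNet-style preprocessing.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

image = Image.open("campsite_photo.jpg").convert("RGB")  # hypothetical photo from the trail
batch = preprocess(image).unsqueeze(0)                   # add a batch dimension

with torch.no_grad():                                    # inference only, no gradients needed
    logits = model(batch)
    top_prob, top_class = torch.softmax(logits, dim=1).max(dim=1)

print(f"Predicted class index {top_class.item()} with confidence {top_prob.item():.2f}")
```

In a real app, the classifier would be fine-tuned on insect imagery, quantized, and exported to a mobile runtime, but the inference loop would look essentially the same.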
The Qualcomm and Meta Partnership
Qualcomm is working with Meta to implement Llama 2-based AI language models on flagship smartphones and PCs starting in 2024, which I believe is an excellent idea since it will give development teams the ability to create new and exciting generative AI applications using AI-based capabilities on Snapdragon platforms (a minimal sketch of the on-device pattern follows the list below). The key benefits of the relationship include:
- Cost Savings: The ability to run AI-based applications on-device will result in fixed and variable cost savings by reducing or eliminating per-query cloud costs. This approach also has the potential to reduce networking costs, since fewer users will be taxing wireless access points or the network. Another variable cost savings comes from being able to work on documents containing confidential information locally, leveraging AI, rather than paying for cloud processing.
- Personalization: Running AI-based applications locally on the device allows a much broader opportunity for customized experiences, since the device is constantly learning the user’s behavior, whether they are texting, talking, taking pictures, or capturing video. Thus, as the virtual assistant example above shows, your device knows you and can optimize your life and help keep you safer.
- Privacy and Security: More than ever, users struggle to discern what corporate information is confidential versus non-confidential. For example, hackers often use open-source intelligence (OSINT) techniques to build a profile on a target, such as an executive, in order to launch a phishing attack. Having employees run LLMs locally for productivity tasks mitigates the risk of exposing confidential information.
- Application Fidelity: Running AI on the device also reduces the risk of impaired performance in the event of natural disasters, blackouts, or other disruptions to cloud connectivity. Downtime can be limited when workers can still use AI for content creation or quantitative analysis.
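For a concrete sense of what running an LLM on-device looks like to a developer, the sketch below loads a quantized Llama 2 checkpoint with the open-source llama-cpp-python bindings. This is a stand-in illustration, not Qualcomm’s Snapdragon tooling, and it assumes a quantized model file has already been downloaded to the device.

```python
# Minimal sketch: running a quantized Llama 2 model locally with llama-cpp-python.
# The model path is a placeholder for a quantized checkpoint you have downloaded;
# this illustrates the on-device pattern, not any specific vendor toolchain.
from llama_cpp import Llama

llm = Llama(
    model_path="llama-2-7b-chat.Q4_K_M.gguf",  # hypothetical 4-bit quantized weights
    n_ctx=2048,                                # modest context window to limit memory use
)

# The prompt is processed entirely on the device; nothing is sent to a cloud service.
response = llm(
    "Summarize this confidential meeting note in three bullet points: ...",
    max_tokens=256,
    stop=["</s>"],
)

print(response["choices"][0]["text"])
```

Because the entire round trip happens locally, there is no per-query cloud bill and the prompt never leaves the device, which is the cost and privacy argument from the list above in miniature.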
Viewpoint
We will be living in a hybrid AI world that parallels the evolution of the SaaS and IaaS markets. Some organizations will be comfortable letting their employees use AI in the cloud while others will not, depending on the vertical market, the department, or the sensitivity of the information. Thus, on-device AI will accelerate and be used to improve productivity in areas such as content creation tools, productivity applications, virtual assistants, and much more.
As mentioned in the camping example above, in the future your on-device AI assistant will also curate other AI-based applications that make your life easier, since it has a baseline of your activity. In fact, many of these lightweight models are less than 100 MB, which means we will continue to see developers creating more AI-infused, lightweight applications that can run on the device, leveraging techniques such as quantization (a minimal sketch appears below).
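As an illustration of the kind of technique developers use to shrink model footprints toward those sizes, the sketch below applies post-training dynamic quantization in PyTorch to a toy model; the layer sizes are arbitrary and the measured sizes are illustrative, not a claim about any specific application.

```python
# Minimal sketch: post-training dynamic quantization in PyTorch, one of the
# techniques alluded to above for shrinking models so they fit on-device.
# The model here is a toy stand-in, not an actual production assistant model.
import os
import torch
import torch.nn as nn

# Toy model: a couple of linear layers standing in for a larger network.
model = nn.Sequential(nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 128))

# Convert the Linear layers' weights to 8-bit integers.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

def size_on_disk_mb(m: nn.Module) -> float:
    """Serialize the model and report its size in megabytes."""
    torch.save(m.state_dict(), "tmp.pt")
    size = os.path.getsize("tmp.pt") / 1e6
    os.remove("tmp.pt")
    return size

print(f"fp32 model: {size_on_disk_mb(model):.2f} MB")
print(f"int8 model: {size_on_disk_mb(quantized):.2f} MB")  # roughly 4x smaller
```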
Disclosure: The Futurum Group is a research and advisory firm that engages or has engaged in research, analysis, and advisory services with many technology companies, including those mentioned in this article. The author does not hold any equity positions with any company mentioned in this article.
Analysis and opinions expressed herein are specific to the analyst individually and data and other information that might have been provided for validation, not those of The Futurum Group as a whole.
Other insights from The Futurum Group:
AMD and Hugging Face Team Up to Democratize AI Compute
Generative AI Investment Accelerating $1.3 billion for LLM Inflection
Qualcomm’s 212S and 9205S: Empowering IoT Tracking in Hard-to-Reach Areas