
What is an Adapter?

An Adapter in generative AI is a small module or layer inserted into a pre-trained transformer-based model (Google Cloud, 2023). During fine-tuning, only the adapter's parameters are updated while the rest of the model stays frozen, which makes it practical to transfer a pre-trained model's knowledge to new tasks and domains (IBM Research Blog, 2023). The adapter projects the original feature size down to a smaller dimension, then back up to the original size, serving as a compact and extensible method for transfer learning.
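To make the down-projection/up-projection idea concrete, here is a minimal sketch of a bottleneck adapter in PyTorch. The class name, the 768/64 dimensions, and the choice of a ReLU nonlinearity with a residual connection are illustrative assumptions, not any particular library's implementation.

```python
import torch
import torch.nn as nn

class BottleneckAdapter(nn.Module):
    """Minimal bottleneck adapter: project down, apply a nonlinearity,
    project back up, and add the result to the original input (residual)."""

    def __init__(self, hidden_size: int, bottleneck_size: int):
        super().__init__()
        self.down_proj = nn.Linear(hidden_size, bottleneck_size)  # e.g. 768 -> 64
        self.activation = nn.ReLU()
        self.up_proj = nn.Linear(bottleneck_size, hidden_size)    # 64 -> 768

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        # The residual connection keeps the frozen model's representation intact
        # and lets the adapter learn a small task-specific correction.
        return hidden_states + self.up_proj(self.activation(self.down_proj(hidden_states)))

# Example: an adapter for a transformer layer with 768-dimensional features.
adapter = BottleneckAdapter(hidden_size=768, bottleneck_size=64)
x = torch.randn(2, 16, 768)   # (batch, sequence length, hidden size)
print(adapter(x).shape)       # torch.Size([2, 16, 768])
```

Because the bottleneck dimension is much smaller than the hidden size, each adapter adds only a small fraction of the base model's parameters.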

Types of Adapters

There are several different types of adapters in generative AI:

  • Knowledge adapters - These adapters allow models to incorporate different types of knowledge or capabilities, like common sense or reasoning, without retraining the entire model. For example, a knowledge adapter could add physical common sense to a language model.
  • Task adapters - Task adapters specialize a model for different downstream tasks, such as classification, translation, or summarization. Each task gets its own adapter inserted into the base model architecture, so a single model can be adapted for many different tasks (see the code sketch after this section).
  • Multi-task adapters - With multi-task adapters, a single model can learn multiple tasks simultaneously by having multiple task adapters inserted. This improves efficiency by avoiding the need to retrain entire models per task.
  • Language adapters - Language adapters enable models to handle multiple languages. Rather than training separate models per language, language adapters let a single model work across languages by switching adapters.

Adapters provide an efficient way to specialize generative AI models for different domains without extensive retraining. This flexibility makes them a key technique in transfer learning.
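To illustrate how task and language adapters can be swapped inside one shared model, the sketch below keeps a dictionary of adapters keyed by task name and applies the selected one on top of a frozen backbone. The backbone (a single linear layer standing in for a pre-trained transformer), the task names, and the sizes are hypothetical.

```python
import torch
import torch.nn as nn

def make_adapter(hidden_size: int, bottleneck_size: int) -> nn.Module:
    # Same bottleneck design as the earlier sketch: down-project, ReLU, up-project.
    return nn.Sequential(
        nn.Linear(hidden_size, bottleneck_size),
        nn.ReLU(),
        nn.Linear(bottleneck_size, hidden_size),
    )

class AdapterSwitchingModel(nn.Module):
    """One frozen shared backbone plus a small adapter per task.
    Switching tasks means selecting a different adapter, not retraining the model."""

    def __init__(self, hidden_size: int = 768, bottleneck_size: int = 64,
                 tasks=("classification", "summarization", "translation")):
        super().__init__()
        # Stand-in for a pre-trained transformer backbone; frozen so its
        # weights are shared untouched across all tasks.
        self.backbone = nn.Linear(hidden_size, hidden_size)
        for param in self.backbone.parameters():
            param.requires_grad = False
        # One lightweight adapter per downstream task.
        self.adapters = nn.ModuleDict(
            {task: make_adapter(hidden_size, bottleneck_size) for task in tasks}
        )

    def forward(self, x: torch.Tensor, task: str) -> torch.Tensor:
        hidden = self.backbone(x)
        # The residual keeps the backbone's features and adds a task-specific correction.
        return hidden + self.adapters[task](hidden)

model = AdapterSwitchingModel()
x = torch.randn(2, 768)
summary_out = model(x, task="summarization")   # uses the summarization adapter
translate_out = model(x, task="translation")   # same backbone, different adapter
```

Adding a new task then amounts to training one more small adapter while the shared backbone stays untouched.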

Everyday Examples of Adapters

Here are three everyday analogies that help explain what adapters do:

  • Phone charger adapters - Just as a phone charger adapter lets you connect your charger to different types of electrical outlets, adapters in AI let a single pre-trained model connect to different tasks and domains.

  • Travel adapters for outlets - When traveling abroad, a travel adapter acts as a bridge that lets you plug devices from your home country into foreign electrical outlets. AI adapters serve a similar role, letting a pre-trained model's knowledge plug into new tasks without modifying the model itself.

  • Language interpreters - An interpreter acts as a translation interface between two people speaking different languages. Similarly, adapters translate a general-purpose model's knowledge into the "language" of a specific task, facilitating transfer learning.

Impact on Teams

The use of adapters in generative AI models can have several benefits for engineering and data science teams:

  • More efficient transfer learning - Adapters allow teams to leverage existing pre-trained models and fine-tune them for new tasks and domains without having to retrain the entire model. This makes transfer learning more efficient.

  • Reduced compute costs - Because only the adapter parameters are trained rather than the full model, training costs and time drop significantly (see the sketch at the end of this section).

  • Faster iteration - Engineers can rapidly customize models for new applications by inserting task-specific adapters. This enables faster experimentation and iteration.

Overall, adapters enable teams to be more agile with AI model development, save on compute resources, and accelerate time-to-value. By facilitating transfer learning, adapters allow teams to tap into existing state-of-the-art models rather than building custom models from scratch.
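As a rough sketch of the compute savings, assume a 12-layer transformer encoder standing in for a pre-trained backbone: the snippet below freezes every backbone parameter, attaches one small bottleneck adapter per layer, and counts what is actually trainable. The layer sizes and adapter dimensions are illustrative assumptions.

```python
import torch.nn as nn

# Stand-in for a pre-trained backbone: a 12-layer transformer encoder
# (roughly BERT-base sized). In practice this would be a loaded checkpoint.
backbone = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=768, nhead=12, dim_feedforward=3072),
    num_layers=12,
)

# Freeze every backbone parameter: no gradients, no optimizer state to store.
for param in backbone.parameters():
    param.requires_grad = False

# One bottleneck adapter per layer is the only thing that gets trained.
adapters = nn.ModuleList(
    nn.Sequential(nn.Linear(768, 64), nn.ReLU(), nn.Linear(64, 768))
    for _ in range(12)
)

total = sum(p.numel() for p in backbone.parameters()) + sum(p.numel() for p in adapters.parameters())
trainable = sum(p.numel() for p in adapters.parameters())
print(f"trainable parameters: {trainable:,} of {total:,} ({100 * trainable / total:.1f}%)")
```

With these illustrative sizes, the trainable adapters amount to only about 1-2% of the total parameter count, which is what makes per-task fine-tuning cheap.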

Impact on Customers

AI assistants powered by adapters could offer a wide range of customizable solutions for customer needs. Because adapters let models be fine-tuned quickly, companies can tailor conversational bots and other AI tools to individual preferences.

This also allows for faster iterations and updates to AI products as new skills and languages are added via adapters. Customers would benefit from these rapid improvements in capability.

Because adapters reuse a pre-trained model's knowledge instead of training new models from scratch, training and deployment costs are lower. These savings can then be passed on to customers in the form of lower prices.

Overall, adapters facilitate more customizable, rapidly-evolving, and affordable AI systems, providing better experiences for end users.