Using Multiple AI Models

FreeToken AI Agents come with one AI model selected. When the device session is registered, the Agent is selected and its default model is assigned. However, there are use cases where you may want to use different AI models in your application. You can do this easily by passing the modelCode parameter, which is accepted by many of the SDK methods.
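
The sketch below contrasts the two cases: omitting modelCode uses the Agent's default model, while passing it targets a specific model for that call. It assumes modelCode is an optional parameter and that the default model is already available for the registered device session; the model code shown is just an example.

// Uses the Agent's default model (modelCode omitted)
FreeToken.shared.generateCompletion(
  prompt: "Hybrid AI is ",
  success: { response in print("Completion: \(response.text)") },
  error: { error in print("Error: \(error.localizedDescription)") }
)

// Overrides the model for this call only by passing modelCode
FreeToken.shared.generateCompletion(
  modelCode: "gemma3n_e2b_it",
  prompt: "Hybrid AI is ",
  success: { response in print("Completion: \(response.text)") },
  error: { error in print("Error: \(error.localizedDescription)") }
)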

Tip: Finding Model Codes

To find the model code for any of the supported AI models, you can look them up via the SDK using listAllModels or view them in the web console.

await FreeToken.shared.listAllModels(
  success: { models in
    for model in models {
      print("Model Code: \(model.modelCode), Name: \(model.name)")
    }
  },
  error: { error in
    print("Error fetching models: \(error.localizedDescription)")
  }
)

Tip: Message Threads

Message Threads are not tied to a specific AI model or Agent. You can use any AI model with any message thread in your app, which lets you choose the model best suited to the next message, as sketched below.
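
As a rough sketch of what this can look like, the calls below send two messages on the same thread with different model codes. The sendChatMessage method name, the threadID parameter, and the model codes are assumptions for illustration only; refer to the SDK reference for the actual message API.

// Hypothetical message API -- method and parameter names are assumptions
let threadID = "thread_123" // ID of an existing message thread (placeholder)

// First message handled by a smaller on-device model
FreeToken.shared.sendChatMessage(
  threadID: threadID,
  message: "Summarize my last note.",
  modelCode: "gemma3n_e2b_it",
  success: { reply in print(reply.text) },
  error: { error in print("Error: \(error.localizedDescription)") }
)

// A follow-up on the same thread routed to a larger cloud-only model
FreeToken.shared.sendChatMessage(
  threadID: threadID,
  message: "Now expand that summary into a full report.",
  modelCode: "deepseek_r1_0528_cloud",
  success: { reply in print(reply.text) },
  error: { error in print("Error: \(error.localizedDescription)") }
)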

Using a Different Model

In this example, we use a model other than the Agent's default from beginning to end. Because the model is not the default, you need to pass the modelCode parameter to each method.

// You have already registered a device session

FreeToken.shared.downloadAIModel(
  modelCode: "gemma3n_e2b_it",
  success: { downloadState in
      // The model is now on the device; generate a completion with it
      FreeToken.shared.generateCompletion(
        modelCode: "gemma3n_e2b_it",
        prompt: "Hybrid AI is ",
        success: { response in
            print("Completion: \(response.text)")
        },
        error: { error in
            print("Error generating completion: \(error.localizedDescription)")
        }
      )
  },
  error: { error in
    print("Error downloading model: \(error.localizedDescription)")
  }
)

Cloud-Only Models

FreeToken supports not only cloud versions of on-device models for fallback, but also cloud-only models that are not available on edge devices. These models are used in the same way as on-device models, except that they do not require a download step. Select them by specifying the modelCode parameter in the same way.

// You have already registered a device session

FreeToken.shared.generateCompletion(
  modelCode: "deepseek_r1_0528_cloud",
  prompt: "Hybrid AI is ",
  success: { response in
      print("Completion: \(response.text)")
  },
  error: { error in 
    print("Error generating completion: \(error.localizedDescription)")
  }
)