Use any LLM with Just 8 Lines of Code πŸš€

Durvesh Danve - Aug 9 - Dev Community

Ever wondered how easy it could be to harness the power of cutting-edge AI models in your projects?

With just 8 lines of Python code, you can start using a powerful Large Language Model (LLM) without diving into the complexities of training one from scratch.

Let’s see how!

Tools we'll be using:

1. A Hugging Face pretrained model (in this case, Falcon)
2. Python
3. LangChain
4. Google Colab

First, open Google Colab and create a new notebook.

Let's start coding:

Step 1:
Install the necessary libraries:

!pip install langchain huggingface_hub langchain_community

Step 2:
Set up your Hugging Face API token as an environment variable:

import os
os.environ["HUGGINGFACEHUB_API_TOKEN"] = "YOUR_TOKEN"

To get your token:

  1. Visit Hugging Face and sign in or create an account.
  2. Navigate to the settings page and select the Access Tokens tab.
  3. Create a token and replace "YOUR_TOKEN" with your actual token.

Hugging Face Access Tokens section
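If you'd rather not hardcode the token in your notebook, a minimal sketch using Python's built-in getpass module (which prompts for input in Colab) could look like this instead:

import os
from getpass import getpass

# Prompt for the token so it never appears in the notebook's source
os.environ["HUGGINGFACEHUB_API_TOKEN"] = getpass("Hugging Face token: ")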

Step 3:
Import HuggingFaceHub from langchain:

from langchain import HuggingFaceHub

Initialize your Large Language Model (LLM):

llm = HuggingFaceHub(repo_id="tiiuae/falcon-7b-instruct", model_kwargs={"temperature":0.6})

I’m using the tiiuae/falcon-7b-instruct model here, but there are plenty of other models available to explore on the Hugging Face Hub.
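The model_kwargs dictionary is forwarded to the Hugging Face Inference API, so you can tune generation there as well. As a rough sketch (which parameters are honored depends on the hosted model and endpoint), adding a cap on output length might look like this:

# Hypothetical variation: same Falcon model, with a limit on generated tokens
llm = HuggingFaceHub(
    repo_id="tiiuae/falcon-7b-instruct",
    model_kwargs={"temperature": 0.6, "max_new_tokens": 200},
)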

Let’s test the model:

prompt = 'Generate a Python function to print the Fibonacci series. Ensure the code is optimized for efficiency and has minimal time complexity'
response = llm(prompt)
print(response)

This produces output along these lines:

def fibonacci(n):
    if n == 0:
        return 0
    elif n == 1:
        return 1
    else:
        return fibonacci(n - 1)+fibonacci(n - 2)
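Note that the model actually returned the classic recursive version, which runs in exponential time despite the prompt asking for efficiency, and your output will vary from run to run at this temperature. For comparison, a hand-written iterative version runs in linear time:

def fibonacci(n):
    # Iterative version: O(n) time, O(1) extra space
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a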

And just like that, with only 8 lines of code, we’ve set up our own version of ChatGPT! πŸŽ‰πŸ’»

Complete Code

# Install necessary libraries
!pip install langchain huggingface_hub langchain_community

import os
os.environ["HUGGINGFACEHUB_API_TOKEN"] = "YOUR_TOKEN"

from langchain import HuggingFaceHub

# Initialize the model
llm = HuggingFaceHub(repo_id="tiiuae/falcon-7b-instruct", model_kwargs={"temperature":0.6})

# Use the model to generate a response
prompt = 'Generate a Python function to print the Fibonacci series. Ensure the code is optimized for efficiency and has minimal time complexity'
response = llm(prompt)
print(response)
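One caveat: depending on your LangChain version, HuggingFaceHub and the llm(prompt) call style may emit deprecation warnings. A roughly equivalent sketch using the newer langchain-huggingface package (assuming you have installed it with pip install langchain-huggingface) would be:

from langchain_huggingface import HuggingFaceEndpoint

# Same Falcon model via the newer API; generation parameters are passed directly
llm = HuggingFaceEndpoint(
    repo_id="tiiuae/falcon-7b-instruct",
    temperature=0.6,
    max_new_tokens=200,
)
print(llm.invoke(prompt))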