How to control your 🚁 drone using #Azure OpenAI or OpenAI APIs and Semantic Kernel

El Bruno - Aug 3 '23 - Dev Community

Coding4Fun Drone 🚁 posts

  1. Introduction to DJI Tello
  2. Analyzing Python samples code from the official SDK
  3. Drone Hello World ! Takeoff and land
  4. Tips to connect to Drone WiFi in Windows 10
  5. Reading data from the Drone, Get battery level
  6. Sample for real time data read, Get Accelerometer data
  7. How the drone camera video feed works, using FFMPEG to display the feed
  8. Open the drone camera video feed using OpenCV
  9. Performance and OpenCV, measuring FPS
  10. Detect faces using the drone camera
  11. Detect a banana and land!
  12. Flip when a face is detected!
  13. How to connect to Internet and to the drone at the same time
  14. Video with real time demo using the drone, Python and Visual Studio Code
  15. Using custom vision to analyze drone camera images
  16. Drawing frames for detected objects in real-time in the drone camera feed
  17. Save detected objects to local files, images and JSON results
  18. Save the Drone camera feed into a local video file
  19. Overlay images into the Drone camera feed using OpenCV
  20. Instance Segmentation from the Drone Camera using OpenCV, TensorFlow and PixelLib
  21. Create a 3×3 grid on the camera frame to detect objects and calculate positions in the grid
  22. Create an Azure IoT Central Device Template to work with drone information
  23. Create a Drone Device for Azure IoT Central
  24. Send drone information to Azure IoT Central
  25. Using GPT models to generate code to control the drone. Using ChatGPT
  26. Generate code to control the 🚁 drone using Azure OpenAI Services or OpenAI APIs, and Semantic Kernel

Hi!

In my previous post, I wrote about how we can use ChatGPT to generate Python code to control a drone.

Today, we are going to use Azure OpenAI Services or OpenAI APIs to generate the same code using GPT models.

We are going to use the same prompt that we had for ChatGPT.


Use this python code as reference 

[CODE START]
# display battery level
sendReadCommand('battery?')
print(f'Battery: {battery} %')

# take off
sendCommand("takeoff") 

# flip the drone left
sendCommand("flip l") 

# move drone up 5 cms
sendCommand("up 5") 

# move drone left 5 cms
sendCommand("left 5") 

# rotate drone clock wise 90 degrees
sendCommand("cw 90") 

# land the drone
sendCommand("land") 

[CODE END]

Generate code to takeoff the drone, flip right, move down 30 centimeters and land


Running this prompt in Azure OpenAI Studio generates the correct Python code:
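Reconstructed from the reference prompt and the commands the unit tests below assert on, the output looks roughly like this (your exact completion may differ):

# take off
sendCommand("takeoff")

# flip the drone right
sendCommand("flip r")

# move drone down 30 cms
sendCommand("down 30")

# land the drone
sendCommand("land")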

Both Azure OpenAI Services and OpenAI APIs have amazing SDKs; however, I’ll use Semantic Kernel to have a single code base that can use both.

_Note: this is the best way to learn Semantic Kernel for .NET and Python:_

Start learning how to use Semantic Kernel.

Now, let’s create a simple function that:

  • Uses Semantic Kernel to generate drone commands
  • Uses Azure OpenAI Services or OpenAI APIs to generate drone commands
  • Uses the semantic skill “DroneAI” to generate drone commands
  • Returns the generated drone commands as a string

Here is the function code:
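What follows is a minimal sketch of what `DroneGenCommandsWithAI.py` can look like, assuming the 2023 (0.x) Semantic Kernel Python SDK with credentials read from a local `.env` file; the `GenCommands` function-folder name and the `text-davinci-003` model are my placeholders:

# requires: pip install semantic-kernel
import semantic_kernel as sk
from semantic_kernel.connectors.ai.open_ai import (
    AzureTextCompletion,
    OpenAITextCompletion,
)

def generate_drone_commands(input_text: str, use_azure: bool = False) -> str:
    kernel = sk.Kernel()

    if use_azure:
        # Azure OpenAI Services: deployment, API key and endpoint from .env
        deployment, api_key, endpoint = sk.azure_openai_settings_from_dot_env()
        kernel.add_text_completion_service(
            "dv", AzureTextCompletion(deployment, endpoint, api_key)
        )
    else:
        # OpenAI APIs: API key and organization id from .env
        api_key, org_id = sk.openai_settings_from_dot_env()
        kernel.add_text_completion_service(
            "dv", OpenAITextCompletion("text-davinci-003", api_key, org_id)
        )

    # import the DroneAI skill from the Skills directory
    drone_skill = kernel.import_semantic_skill_from_directory("Skills", "DroneAI")

    # run the semantic function and return the generated Python code
    result = drone_skill["GenCommands"](input_text)
    return str(result)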

And I need to have a Skills directory with my DroneAI skill. Something like this:

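In text form the layout looks like this, with one folder per semantic function (the `GenCommands` name matches the sketch above; each function folder holds a `skprompt.txt` and a `config.json`):

Skills/
└── DroneAI/
    └── GenCommands/
        ├── skprompt.txt
        └── config.json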

Here is the config.json content:


{
  "schema": 1,
  "description": "Generate commands to control the drone",
  "type": "completion",
  "completion": {
    "max_tokens": 1000,
    "temperature": 0.0,
    "top_p": 1.0,
    "presence_penalty": 0.0,
    "frequency_penalty": 0.0
  },
  "input": {
    "parameters": [
      {
        "name": "input",
        "description": "commands for the drone",
        "defaultValue": ""
      }
    ]
  }
}

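A quick note on these values: with `temperature` at 0.0 the completion is as deterministic as the model allows, which helps later when the unit tests assert exact command strings like `flip r`.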

And the skprompt.txt:


Use this python code as reference 

[CODE START]
# display battery level
sendReadCommand('battery?')
print(f'Battery: {battery} %')

# take off
sendCommand("takeoff") 

# flip the drone left
sendCommand("flip l") 

# move drone up 5 cms
sendCommand("up 5") 

# move drone left 5 cms
sendCommand("left 5") 

# rotate drone clock wise 90 degrees
sendCommand("cw 90") 

# land the drone
sendCommand("land")
[CODE END]

Generate python code only to follow these orders
{{$input}}
+++++

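`{{$input}}` is the Semantic Kernel prompt-template variable declared under `input.parameters` in `config.json`; at run time it is replaced with the string passed to the function, so the plain-English orders land at the end of the prompt.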

And that’s it! We can test this function with two unit tests, one for each platform:


import unittest
from DroneGenCommandsWithAI import generate_drone_commands

class TestGenerateDroneCommands(unittest.TestCase):

    test_command = "takeoff the drone, flip right, move down 30 centimeters and land the drone"

    def test_generate_drone_commands_using_openai(self):
        print('###############################################')
        print(' Test using OpenAI APIs')
        commands = generate_drone_commands(self.test_command)

        # validate that the generated string contains the drone commands
        self.assertIn('takeoff', commands)
        self.assertIn('flip r', commands)
        self.assertIn('down 30', commands)
        self.assertIn('land', commands)
        print('###############################################')

    def test_generate_drone_commands_using_azure_openai(self):
        print('###############################################')
        print(' Test using AZURE OpenAI Services')
        # the 2nd argument switches the function to Azure OpenAI Services
        commands = generate_drone_commands(self.test_command, True)

        # validate that the generated string contains the drone commands
        self.assertIn('takeoff', commands)
        self.assertIn('flip r', commands)
        self.assertIn('down 30', commands)
        self.assertIn('land', commands)
        print('###############################################')

if __name__ == '__main__':
    unittest.main()

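Because the script ends with `unittest.main()`, running the file directly with Python executes both tests; `python -m unittest discover` from the project folder also picks them up, as long as the file name starts with `test`.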

And they both work!

test successfully run

In the next post we will review some options and configurations necessary for correct code generation.

Happy coding!

Greetings

El Bruno

More posts in my blog ElBruno.com.

More info in https://beacons.ai/elbruno

