Old Way vs New Way (With Tools)

Your Understanding is Correct for the OLD WAY

You’re absolutely right! The basic OpenAI API works like this:

OLD WAY (Basic Chat - Still Works!)

Request:

 
```json
{
  "model": "gpt-4",
  "messages": [
    { "role": "system", "content": "You are a helpful assistant" },
    { "role": "user", "content": "Explain DevOps in 1 line" }
  ],
  "temperature": 0.7,
  "max_tokens": 50
}
```
 

Response:

 
```json
{
  "id": "chatcmpl-XYZ",
  "object": "chat.completion",
  "choices": [{
    "message": {
      "role": "assistant",
      "content": "DevOps is a culture of automating and improving software delivery."
    },
    "finish_reason": "stop"
  }],
  "usage": {
    "prompt_tokens": 20,
    "completion_tokens": 12,
    "total_tokens": 32
  }
}
```
 

Flow:

  1. Send text → AI generates text → Parse response ✅
  2. That’s it! Simple.

NEW WAY (With Tools - Added by OpenAI in 2023)

OpenAI added a new optional feature called “Function Calling” (now “Tools”) to the same API endpoint.

What Changed?

Request (with tools added):

 
```json
{
  "model": "gpt-4",
  "messages": [
    { "role": "system", "content": "You are a helpful assistant" },
    { "role": "user", "content": "What's the weather in Boston?" }
  ],
  "tools": [                           // ← NEW! Optional parameter
    {
      "type": "function",
      "function": {
        "name": "get_weather",
        "description": "Get current weather",
        "parameters": {
          "type": "object",
          "properties": {
            "location": { "type": "string" }
          },
          "required": ["location"]
        }
      }
    }
  ],
  "tool_choice": "auto"                // ← NEW! Optional parameter
}
```
 

Response (AI wants to use a tool):

 
```json
{
  "id": "chatcmpl-ABC",
  "object": "chat.completion",
  "choices": [{
    "message": {
      "role": "assistant",
      "content": null,                 // ← No text response!
      "tool_calls": [                  // ← NEW! AI is requesting to call a function
        {
          "id": "call_123",
          "type": "function",
          "function": {
            "name": "get_weather",
            "arguments": "{\"location\": \"Boston\"}"
          }
        }
      ]
    },
    "finish_reason": "tool_calls"      // ← NEW! Indicates AI wants to use tools
  }]
}
```
 

Flow:

  1. Send text + tool definitions → AI decides to use a tool
  2. AI returns structured JSON (not text!) telling you which function to call
  3. YOU execute the function in your code
  4. Send the result back to AI
  5. AI generates the final text response (see the sketch below)
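
Here is the whole five-step loop as a minimal TypeScript sketch, assuming the v4 `openai` Node SDK. The `get_weather` tool matches the JSON above, while `fetchWeather` is a hypothetical stand-in for whatever your app would really call:

```ts
import OpenAI from "openai";

// Hypothetical helper standing in for a real weather API call.
async function fetchWeather(location: string): Promise<string> {
  return `72°F, sunny in ${location}`;
}

const client = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

const tools: OpenAI.Chat.Completions.ChatCompletionTool[] = [
  {
    type: "function",
    function: {
      name: "get_weather",
      description: "Get current weather",
      parameters: {
        type: "object",
        properties: { location: { type: "string" } },
        required: ["location"],
      },
    },
  },
];

const messages: OpenAI.Chat.Completions.ChatCompletionMessageParam[] = [
  { role: "user", content: "What's the weather in Boston?" },
];

// Step 1: send text + tool definitions.
const first = await client.chat.completions.create({ model: "gpt-4", messages, tools });
const reply = first.choices[0].message;

if (reply.tool_calls) {
  // Step 2: the model returned structured tool calls instead of text.
  messages.push(reply); // keep the assistant turn in the conversation history

  for (const call of reply.tool_calls) {
    if (call.type !== "function") continue;           // only function tools in this sketch
    const args = JSON.parse(call.function.arguments); // e.g. { "location": "Boston" }
    // Step 3: YOU execute the function in your own code.
    const result = await fetchWeather(args.location);
    // Step 4: send the result back, linked to the request by tool_call_id.
    messages.push({ role: "tool", tool_call_id: call.id, content: result });
  }

  // Step 5: the model turns the tool results into a final text answer.
  const second = await client.chat.completions.create({ model: "gpt-4", messages });
  console.log(second.choices[0].message.content);
}
```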

The Key Difference

OLD WAY (Basic Chat)


You: "What's the weather in Boston?"

AI: "I don't have access to real-time weather data, but you can check..."

AI just generates text. It can’t DO anything.

NEW WAY (With Tools)


You: "What's the weather in Boston?"

AI: [Returns JSON] { "function": "get_weather", "arguments": {"location": "Boston"} }

You: [Execute function] → "72°F, sunny"

You: [Send result back to AI]

AI: "The weather in Boston is currently 72°F and sunny!"

AI can request actions, you execute them, AI uses the results.


What Does Shello Do?

Shello uses the NEW WAY to give the AI the ability to:

  • Read files
  • Edit files
  • Run bash commands
  • Search code

Without Tools (Old Way)


User: "Show me package.json"

AI: "I cannot access files. Please paste the contents..."

With Tools (Shello’s Way)


User: "Show me package.json"

AI: [tool_call] { "function": "view_file", "arguments": {"path": "package.json"} }

Shello: [Executes] fs.readFileSync("package.json")

Shello: [Returns] "{ \"name\": \"shello\", \"version\": \"1.0.0\" ... }"

AI: "Here's your package.json: ..."


Is This a Library Feature or an OpenAI Feature?

It’s an OpenAI feature! The OpenAI library (npm install openai) just provides TypeScript types and convenience methods.

You Could Do This With Raw HTTP

Without Library (Raw curl):

 
```bash
curl https://api.openai.com/v1/chat/completions \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4",
    "messages": [{"role": "user", "content": "Hello"}],
    "tools": [{"type": "function", "function": {...}}]
  }'
```
 

With Library (TypeScript):

 
```ts
import OpenAI from "openai";

const client = new OpenAI({ apiKey: "..." });

const response = await client.chat.completions.create({
  model: "gpt-4",
  messages: [{ role: "user", content: "Hello" }],
  tools: [{ type: "function", function: {...} }]
});
```
 

The library just makes it easier! The tools parameter goes directly to OpenAI’s servers.
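
To make that concrete, here is a rough `fetch`-based TypeScript sketch (Node 18+), with no SDK at all; the commented-out `tools` line marks where the same tool definitions from earlier would go:

```ts
// Same endpoint and headers as the curl example above, no SDK involved.
const res = await fetch("https://api.openai.com/v1/chat/completions", {
  method: "POST",
  headers: {
    Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    model: "gpt-4",
    messages: [{ role: "user", content: "Hello" }],
    // tools: [...]  // add the same tool definitions shown earlier to enable tool calls
  }),
});

const data = await res.json();
console.log(data.choices[0].message); // either text content or a tool_calls array
```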


Complete Flow Comparison

OLD WAY (Basic Chat)


```
┌─────────┐                    ┌─────────┐
│  Your   │  "What's 2+2?"     │ OpenAI  │
│  Code   │ ──────────────────>│   API   │
│         │                    │         │
│         │  "2+2 equals 4"    │         │
│         │ <──────────────────│         │
└─────────┘                    └─────────┘
```

NEW WAY (With Tools)


```
┌─────────┐                    ┌─────────┐
│  Your   │  "What's 2+2?"     │ OpenAI  │
│  Code   │  + [calculator]    │   API   │
│         │ ──────────────────>│         │
│         │                    │         │
│         │  tool_call:        │         │
│         │  calculate("2+2")  │         │
│         │ <──────────────────│         │
│         │                    │         │
│  [YOU   │                    │         │
│  execute│                    │         │
│  calc]  │                    │         │
│  = 4    │                    │         │
│         │                    │         │
│         │  result: "4"       │         │
│         │ ──────────────────>│         │
│         │                    │         │
│         │  "The answer is 4" │         │
│         │ <──────────────────│         │
└─────────┘                    └─────────┘
```


Why Did OpenAI Add This?

Before tools, people were doing this:


System: "You are a calculator. When user asks math, respond with JSON: {\"action\":\"calculate\",\"expression\":\"2+2\"}"

User: "What's 2+2?"

AI: "{\"action\":\"calculate\",\"expression\":\"2+2\"}"

Code: [Parse JSON, execute, send back]

Problems:

  • AI might not format JSON correctly
  • Had to parse unreliable text
  • No standard format

With Tools:

  • The model is trained to emit well-formed JSON arguments that match your schema
  • Standard format across all apps
  • Arguments arrive in a dedicated tool_calls field, so you parse one predictable structure instead of free text (see the sketch below)
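
Here is a small sketch of what that difference means for your parsing code; the type alias assumes the v4 `openai` SDK, and the function names are just for illustration:

```ts
import OpenAI from "openai";

type Message = OpenAI.Chat.Completions.ChatCompletionMessage;

// Old prompt-hack: hope the free-text reply happens to be parseable JSON.
function parseOldWay(message: Message): unknown {
  try {
    return JSON.parse(message.content ?? ""); // breaks if the model adds prose or markdown fences
  } catch {
    return null; // re-prompt, regex, or give up
  }
}

// Tools: the arguments arrive in a dedicated, schema-guided field.
function parseNewWay(message: Message): unknown {
  const call = message.tool_calls?.[0];
  return call && call.type === "function"
    ? JSON.parse(call.function.arguments) // e.g. { "expression": "2+2" }
    : null;
}
```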

Summary

| Aspect | OLD WAY | NEW WAY (Tools) |
|--------|---------|-----------------|
| Request | Just messages | Messages + tool definitions |
| Response | Always text | Text OR tool_calls |
| AI Capability | Generate text only | Generate text + request actions |
| Your Code | Parse text | Execute functions |
| Use Case | Chat, Q&A | Agents, automation, file ops |
| OpenAI Feature? | ✅ Yes | ✅ Yes (added 2023) |
| Library Feature? | No, just types | No, just types |

Bottom Line:

  • The openai npm package is just a wrapper for HTTP requests
  • The tools parameter is a real OpenAI API feature
  • Shello uses this feature to let AI interact with your filesystem
  • You could build the same thing with raw HTTP requests!

The “magic” is that OpenAI trained their models to understand function schemas and generate valid tool calls. That’s what you’re paying for!