
Overview

Web search tools enable agents to find and retrieve information from the internet. This page covers general web search, news-specific search, and advanced web scraping.

Available Search Tools

DuckDuckGoTools (Agno)

from agno.tools.duckduckgo import DuckDuckGoTools
Parameters:
  • search (boolean): Enable general web search functionality.
  • news (boolean, default: false): Enable news-specific search functionality.

Basic Usage

from agno.agent import Agent
from agno.models.nebius import Nebius
from agno.tools.duckduckgo import DuckDuckGoTools
import os

agent = Agent(
    name="web_agent",
    role="search the web for information based on user input",
    model=Nebius(
        id="deepseek-ai/DeepSeek-R1-0528",
        api_key=os.getenv("NEBIUS_API_KEY")
    ),
    tools=[DuckDuckGoTools(search=True, news=True)],
    instructions=[
        "You are a professional web search AI agent",
        "Search the web for accurate, up-to-date information",
        "Provide exact information available on the web"
    ]
)

response = agent.run("What are the latest developments in AI?")
print(response.content)

WebSearchService (TypeScript)

For TypeScript projects using the hedge fund example architecture:
import { WebSearchService } from './services/utils/WebSearchService';

Parameters:
  • apiKey (string, default: process.env.SERPER_API_KEY): API key for the Serper search service.
  • searchUrl (string, default: 'https://google.serper.dev/search'): Base URL for the search API endpoint.

Constructor

const searchService = new WebSearchService(
    apiKey?: string,
    searchUrl?: string
);

Methods

search()

Performs a web search and returns structured results.

public async search(query: string): Promise<SearchResponse>

Parameters:
  • query (string, required): The search query string.

Returns a SearchResponse with:
  • results (WebSearchResult[]): Array of search results.
  • success (boolean): Whether the search was successful.
  • error (string, optional): Error message if the search failed.

TypeScript Types

interface SearchResponse {
  results: WebSearchResult[];
  success: boolean;
  error?: string;
}

interface WebSearchResult {
  title: string;
  snippet: string;
  url: string;
}

Example Usage

import { WebSearchService } from './services/utils/WebSearchService';

const searchService = new WebSearchService(
  process.env.SERPER_API_KEY,
  'https://google.serper.dev/search'
);

const results = await searchService.search('AI news 2026');

if (results.success) {
  results.results.forEach(result => {
    console.log(`Title: ${result.title}`);
    console.log(`Snippet: ${result.snippet}`);
    console.log(`URL: ${result.url}`);
  });
} else {
  console.error(`Search failed: ${results.error}`);
}

ScrapeGraphTools (Advanced Web Scraping)

For advanced web scraping and content extraction:
from agno.tools.scrapegraph import ScrapeGraphTools
import os

scraper = ScrapeGraphTools(api_key=os.getenv("SGAI_API_KEY"))
Parameters:
  • api_key (string, required): ScrapeGraph AI API key for advanced web scraping.

Usage in Workflows

from agno.agent import Agent
from agno.models.nebius import Nebius
from agno.tools.scrapegraph import ScrapeGraphTools
from agno.workflow import Workflow
import os

class ResearchWorkflow(Workflow):
    searcher: Agent = Agent(
        tools=[ScrapeGraphTools(api_key=os.getenv("SGAI_API_KEY"))],
        model=Nebius(
            id="deepseek-ai/DeepSeek-V3-0324",
            api_key=os.getenv("NEBIUS_API_KEY")
        ),
        show_tool_calls=True,
        markdown=True,
        description="Expert at finding and extracting information from the web",
        instructions=[
            "Search for the most recent and authoritative sources",
            "Extract key facts, statistics, and expert opinions",
            "Cover multiple perspectives and highlight controversies",
            "Include relevant statistics and data",
            "Organize findings in a clear, structured format",
            "Mention references and sources of the content"
        ]
    )

Multi-Tool Search Agent

Combine multiple search tools for comprehensive coverage:
from agno.agent import Agent
from agno.models.nebius import Nebius
from agno.tools.duckduckgo import DuckDuckGoTools
from agno.tools.yfinance import YFinanceTools
import os

web_search_agent = Agent(
    name="web_agent",
    role="search the web for information",
    model=Nebius(
        id="deepseek-ai/DeepSeek-R1-0528",
        api_key=os.getenv("NEBIUS_API_KEY")
    ),
    tools=[DuckDuckGoTools(search=True, news=True)],
    instructions=[
        "You are a professional web search AI agent",
        "Search the web for accurate information",
        "Provide exact information available on the web"
    ]
)

financial_agent = Agent(
    name="financial_agent",
    role="get financial information",
    model=Nebius(
        id="Qwen/Qwen3-32B",
        api_key=os.getenv("NEBIUS_API_KEY")
    ),
    tools=[
        YFinanceTools(
            stock_price=True,
            analyst_recommendations=True,
            stock_fundamentals=True,
            company_info=True,
            technical_indicators=True,
            historical_prices=True,
            key_financial_ratios=True,
            income_statements=True
        )
    ],
    instructions=[
        "You are a professional financial advisor AI agent",
        "Provide accurate financial information to users",
        "Include stock prices, recommendations, and fundamentals"
    ]
)

multi_agent = Agent(
    team=[web_search_agent, financial_agent],
    model=Nebius(
        id="meta-llama/Llama-3.3-70B-Instruct",
        api_key=os.getenv("NEBIUS_API_KEY")
    ),
    markdown=True
)

Search Configuration Options

Serper API Configuration

When using WebSearchService with Serper:
const response = await axios.post(searchUrl, {
  q: query,              // Search query
  gl: 'us',             // Geographic location
  hl: 'en',             // Language
  num: 10               // Number of results
}, {
  headers: {
    'X-API-KEY': apiKey,
    'Content-Type': 'application/json'
  }
});

Result Types

The search service aggregates multiple result types:
  • Organic results: Standard web search results
  • People Also Ask: Related questions and answers
  • Knowledge Graph: Structured information panels

Best Practices

  1. Enable appropriate search types: Only enable search or news based on your needs
  2. Set clear instructions: Guide the agent on how to use search results
  3. Use markdown formatting: Enable markdown=True for better result presentation
  4. Handle errors gracefully: Check success field before processing results
  5. Respect rate limits: Implement delays for high-volume searches
  6. Cache results: Store frequently accessed search results to reduce API calls
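Practices 5 and 6 (rate limiting and caching) can be combined in a small wrapper. The sketch below is framework-agnostic: `search_fn` stands in for any search callable you already have, and the TTL and interval values are illustrative:

```python
import time

class CachedSearch:
    """Wrap a search callable with TTL caching and a minimum call interval."""

    def __init__(self, search_fn, ttl_seconds=300, min_interval=1.0):
        self.search_fn = search_fn
        self.ttl = ttl_seconds
        self.min_interval = min_interval
        self._cache = {}       # query -> (timestamp, results)
        self._last_call = 0.0

    def search(self, query):
        now = time.monotonic()
        hit = self._cache.get(query)
        if hit and now - hit[0] < self.ttl:
            return hit[1]      # fresh cached result, no API call made

        # Respect rate limits: space out live calls.
        wait = self.min_interval - (now - self._last_call)
        if wait > 0:
            time.sleep(wait)

        results = self.search_fn(query)
        self._last_call = time.monotonic()
        self._cache[query] = (self._last_call, results)
        return results
```

Repeated queries within the TTL are served from memory, which both reduces API cost and keeps you under provider rate limits.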

Environment Variables

# For Serper API (WebSearchService)
SERPER_API_KEY=your_serper_api_key

# For ScrapeGraph AI (advanced scraping)
SGAI_API_KEY=your_scrapegraph_api_key
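A fail-fast check at startup avoids confusing downstream errors when a key is unset. This is a minimal sketch; the helper name `require_env` is illustrative, and the variable names follow the list above:

```python
import os

def require_env(*names: str) -> dict:
    """Return the named environment variables, failing fast if any is unset."""
    missing = [n for n in names if not os.getenv(n)]
    if missing:
        raise RuntimeError(
            f"Missing required environment variables: {', '.join(missing)}"
        )
    return {n: os.environ[n] for n in names}

# Example: validate the keys used in this guide before building any agents.
# keys = require_env("NEBIUS_API_KEY", "SERPER_API_KEY", "SGAI_API_KEY")
```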

Error Handling

Python (Agno)

try:
    response = agent.run("Search query")
    print(response.content)
except Exception as e:
    print(f"Search failed: {e}")

TypeScript

const result = await searchService.search(query);

if (!result.success) {
  console.error(`Search error: ${result.error}`);
  return;
}

if (result.results.length === 0) {
  console.log('No results found');
  return;
}

// Process results
result.results.forEach(item => {
  console.log(item.title, item.url);
});
