# Agno

Agno's message format is OpenAI-compatible, so it works directly with Acontext.
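In practice, this means a message is a plain dict with `role` and `content` keys: the same shape the example below stores in Acontext via `store_message` and passes to Agno's `agent.run` as history. A minimal sketch of that shape:

```python
# OpenAI-compatible chat messages: plain dicts with "role" and "content".
# This is the shape stored in Acontext and passed to Agno as history.
conversation = [
    {"role": "system", "content": "You are a helpful assistant"},
    {"role": "user", "content": "Plan a 3-day trip to Finland"},
    {"role": "assistant", "content": "Day 1: Helsinki. Day 2: Porvoo. Day 3: Nuuksio."},
]

# Every message is a serializable dict, so it can be stored and replayed later.
assert all({"role", "content"} <= msg.keys() for msg in conversation)
```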

## Quick Start [#quick-start]

```bash
acontext create my-agno-project --template-path "python/agno-basic"
```

<Note>
  Install CLI first: `curl -fsSL https://install.acontext.io | sh`
</Note>

## Manual Setup [#manual-setup]

<Steps>
  <Step title="Install dependencies">
    ```bash
    pip install agno acontext python-dotenv
    ```
  </Step>

  <Step title="Configure environment">
    Create a `.env` file in your project root:

    ```bash
    OPENAI_API_KEY=your_openai_key_here
    ACONTEXT_API_KEY=sk-ac-your-api-key
    ```
  </Step>

  <Step title="Run agent with Acontext">
    <Accordion title="Complete example">
      ```python
      import os

      from dotenv import load_dotenv

      from agno.agent import Agent
      from agno.models.openai import OpenAIChat
      from agno.tools import tool
      from acontext import AcontextClient

      # Load OPENAI_API_KEY and ACONTEXT_API_KEY from .env
      load_dotenv()

      client = AcontextClient(
          api_key=os.getenv("ACONTEXT_API_KEY"),
      )

      # If you're using self-hosted Acontext:
      # client = AcontextClient(
      #     base_url="http://localhost:8029/api/v1",
      #     api_key="sk-ac-your-root-api-bearer-token",
      # )

      @tool
      def get_weather(city: str) -> str:
          """Returns weather info for the specified city."""
          return f"The weather in {city} is sunny"

      agent = Agent(
          name="Assistant",
          model=OpenAIChat(id="gpt-4"),
          instructions="You are a helpful assistant",
          tools=[get_weather],
      )

      # Create session
      session = client.sessions.create()
      conversation = []

      # User message
      user_msg = {"role": "user", "content": "Plan a 3-day trip to Finland"}
      conversation.append(user_msg)
      client.sessions.store_message(session_id=session.id, blob=user_msg)

      # Run agent
      response = agent.run(conversation)

      # Store response
      assistant_msg = {"role": "assistant", "content": response.content}
      conversation.append(assistant_msg)
      client.sessions.store_message(session_id=session.id, blob=assistant_msg)

      # Extract tasks
      client.sessions.flush(session.id)
      tasks = client.sessions.get_tasks(session.id)

      for task in tasks.items:
          print(f"Task: {task.data.task_description} | Status: {task.status}")
      ```
    </Accordion>
  </Step>
</Steps>
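The store-run-store pattern in the example (persist the user message, run the agent, persist the reply) repeats on every turn, so it can be factored into a small helper. A sketch using only the calls shown above; the `run_turn` name and signature are illustrative, not part of either library:

```python
def run_turn(client, session_id, agent, conversation, user_text):
    """Append a user message, run the agent, and persist both sides in Acontext."""
    user_msg = {"role": "user", "content": user_text}
    conversation.append(user_msg)
    client.sessions.store_message(session_id=session_id, blob=user_msg)

    # Run the agent over the full history so it sees prior turns.
    response = agent.run(conversation)

    assistant_msg = {"role": "assistant", "content": response.content}
    conversation.append(assistant_msg)
    client.sessions.store_message(session_id=session_id, blob=assistant_msg)
    return response
```

With this helper, a multi-turn conversation is a sequence of `run_turn` calls against the same `session.id`, and every message lands in Acontext without extra bookkeeping.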

## Resume Sessions [#resume-sessions]

```python
# Rehydrate the conversation history from a previous session
messages = client.sessions.get_messages(session_id)
conversation = messages.items

conversation.append({"role": "user", "content": "Continue from where we left off"})
response = agent.run(conversation)
```

## Next Steps [#next-steps]

<CardGroup cols="2">
  <Card title="Task Tracking" icon="radar" href="/observe/agent_tasks">
    Monitor agent tasks and progress
  </Card>

  <Card title="Dashboard" icon="chart-simple" href="/observe/dashboard">
    View all interactions
  </Card>
</CardGroup>
