Feat: ReAct #3998

Open: wants to merge 50 commits into base: main

Commits (50)
e94ba5b
fixes and integrations tests
phact Sep 23, 2024
518d571
WIP
phact Sep 24, 2024
ed6a71d
Set `_has_cycle_edges` to `True` for source and target vertices in cy…
ogabrielluiz Sep 5, 2024
8279f98
feat: Add `has_cycle_edges` method to Vertex class
ogabrielluiz Sep 5, 2024
7973cf3
Add `apply_on_outputs` method to Vertex for applying functions to out…
ogabrielluiz Sep 5, 2024
4a81005
Add utility to find vertices in cycles within a directed graph
ogabrielluiz Sep 5, 2024
0363278
Add unit tests for `find_cycle_vertices` utility function in graph mo…
ogabrielluiz Sep 5, 2024
bfaafc9
Add method to set cache for vertices in cycle
ogabrielluiz Sep 5, 2024
b4e1471
refactor: Update caching logic for vertices in cycles
ogabrielluiz Sep 5, 2024
0ea18d4
Refactor `find_cycle_vertices` to use NetworkX for cycle detection
ogabrielluiz Sep 5, 2024
f30a402
Refactor `find_cycle_vertices` tests to remove entry point parameter …
ogabrielluiz Sep 5, 2024
3aaf6f4
Disable cache in cycle: Update `apply_on_outputs` to handle empty out…
ogabrielluiz Sep 5, 2024
0af95c9
Add unit test to ensure output cache is disabled in graph cycles
ogabrielluiz Sep 5, 2024
bf5e834
Add unit test for graph cyclicity with prompt components and OpenAI i…
ogabrielluiz Sep 6, 2024
832d5cd
Convert `_instantiate_components_in_vertices` to async and disable ca…
ogabrielluiz Sep 6, 2024
78853cb
Add default value handling for cycle edges in vertex component
ogabrielluiz Sep 16, 2024
b375612
Switch from os.environ to os.getenv for API key retrieval in test_cyc…
ogabrielluiz Sep 18, 2024
a2bfdc8
Add __repr__ method to Edge class to indicate cycle edges with a symbol
ogabrielluiz Sep 19, 2024
35fe1bd
Refactor test_cycles.py to streamline component initialization and up…
ogabrielluiz Sep 19, 2024
55a7346
Refactor test_cycles.py to streamline component initialization and up…
ogabrielluiz Sep 23, 2024
a99808b
Refactor test to use custom serialization method instead of pickle
ogabrielluiz Sep 23, 2024
9f6de69
Add cycle_vertices property to optimize cycle detection in graph
ogabrielluiz Sep 23, 2024
88b2c9a
Enhance error message in `types.py` to include component ID for bette…
ogabrielluiz Sep 23, 2024
3d2e207
Refactor test_cycles.py to update graph configuration and assertions
ogabrielluiz Sep 23, 2024
31ff469
Add api_key_required marker to test_updated_graph_with_prompts test
ogabrielluiz Sep 23, 2024
46b9588
Add validation to require max_iterations for cyclic graphs
ogabrielluiz Sep 25, 2024
ee84cf5
merge
phact Sep 26, 2024
b8c6809
missed >>>>
phact Sep 26, 2024
8a132d5
dynamic tool selection
phact Oct 1, 2024
e0e38c0
pass thread and assistant ids
phact Oct 1, 2024
49f8d3b
react initial take
phact Oct 1, 2024
0ad0129
merge assistant-manager into react branch
phact Oct 1, 2024
02afb5b
Merge branch 'main' into feat/disablecacheincycle
ogabrielluiz Oct 1, 2024
45f5c07
ReAct test
phact Oct 2, 2024
7b6757e
merge
phact Oct 2, 2024
5fc6f98
back out inputs.py change
phact Oct 2, 2024
5eb4ad4
back out inputs.py change
phact Oct 2, 2024
ffd64f0
add shared component cache
phact Oct 3, 2024
1844202
merge
phact Oct 3, 2024
e452c4f
uv.lock
phact Oct 3, 2024
186a5cd
[autofix.ci] apply automated fixes
autofix-ci[bot] Oct 3, 2024
0604dc6
some ruff style fixes
phact Oct 3, 2024
cf28a6f
Merge branch 'react' of github.com:phact/langflow into react
phact Oct 3, 2024
6de5704
INP001
phact Oct 3, 2024
21da797
Update src/backend/base/langflow/services/shared_component_cache/fact…
phact Oct 3, 2024
cb49a7d
Update src/backend/base/langflow/services/shared_component_cache/serv…
phact Oct 3, 2024
051354a
[autofix.ci] apply automated fixes
autofix-ci[bot] Oct 3, 2024
66765da
make the cache private, fix service/factory
phact Oct 3, 2024
29880f9
[autofix.ci] apply automated fixes
autofix-ci[bot] Oct 3, 2024
d8c3509
Merge branch 'langflow-ai:main' into react
phact Oct 3, 2024
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -104,7 +104,7 @@ dependencies = [
"yfinance>=0.2.40",
"langchain-google-community==1.0.7",
"wolframalpha>=5.1.3",
"astra-assistants>=2.1.2",
"astra-assistants>=2.1.4",
"composio-langchain==0.5.9",
"spider-client>=0.0.27",
"nltk>=3.9.1",
@@ -1,3 +1,4 @@
from .astra_assistant_manager import AstraAssistantManager
from .create_assistant import AssistantsCreateAssistant
from .create_thread import AssistantsCreateThread
from .dotenv import Dotenv
@@ -7,6 +8,7 @@
from .run import AssistantsRun

__all__ = [
"AstraAssistantManager",
"AssistantsCreateAssistant",
"AssistantsGetAssistantName",
"AssistantsListAssistants",
@@ -0,0 +1,134 @@
import asyncio

from astra_assistants.astra_assistants_manager import AssistantManager

from langflow.components.astra_assistants.util import (
get_patched_openai_client,
litellm_model_names,
tool_names,
tools_and_names,
)
from langflow.custom.custom_component.component_with_cache import ComponentWithCache
from langflow.inputs import DropdownInput, MultilineInput, StrInput
from langflow.schema.message import Message
from langflow.template import Output


class AstraAssistantManager(ComponentWithCache):
display_name = "Astra Assistant Manager"
description = "Manages Assistant Interactions"
icon = "bot"

inputs = [
StrInput(
name="instructions",
display_name="Instructions",
info="Instructions for the assistant, think of these as the system prompt.",
),
DropdownInput(
name="model_name",
display_name="Model Name",
advanced=False,
options=litellm_model_names,
value="gpt-4o-mini",
),
DropdownInput(
display_name="Tool",
name="tool",
options=tool_names,
),
Comment on lines +35 to +39

Contributor:
Ideally, the tools would be Components. What if these tools work as default ones?

Collaborator (Author):
Maybe there's a way to do both: pick from the dropdown if the tools are simple and don't have any configuration, or add an edge from a component for more complex tools. We would probably also want a way to accept LangChain BaseTool tools for assistants dynamically.

Collaborator (Author) @phact, Oct 3, 2024:
You should also be able to pass more than one tool and let it decide which to use. Is there a multi-select input in Langflow?

MultilineInput(
name="user_message",
display_name="User Message",
info="User message to pass to the run.",
),
MultilineInput(
name="input_thread_id",
display_name="Thread ID (optional)",
info="ID of the thread",
),
MultilineInput(
name="input_assistant_id",
display_name="Assistant ID (optional)",
info="ID of the assistant",
),
MultilineInput(
name="env_set",
display_name="Environment Set",
info="Dummy input to allow chaining with Dotenv Component.",
),
]

outputs = [
Output(display_name="Assistant Response", name="assistant_response", method="get_assistant_response"),
Output(display_name="Tool output", name="tool_output", method="get_tool_output"),
Output(display_name="Thread Id", name="output_thread_id", method="get_thread_id"),
Output(display_name="Assistant Id", name="output_assistant_id", method="get_assistant_id"),
]

def __init__(self, **kwargs):
super().__init__(**kwargs)
self.lock = asyncio.Lock()
self.initialized = False
self.assistant_response = None
self.tool_output = None
self.thread_id = None
self.assistant_id = None
self.client = get_patched_openai_client(self._shared_component_cache)

async def get_assistant_response(self) -> Message:
await self.initialize()
return self.assistant_response

async def get_tool_output(self) -> Message:
await self.initialize()
return self.tool_output

async def get_thread_id(self) -> Message:
await self.initialize()
return self.thread_id

async def get_assistant_id(self) -> Message:
await self.initialize()
return self.assistant_id

async def initialize(self):
async with self.lock:
if not self.initialized:
await self.process_inputs()
self.initialized = True

async def process_inputs(self):
print(f"env_set is {self.env_set}")
print(self.tool)
tools = []
tool_obj = None
if self.tool is not None and self.tool != "":
tool_cls = tools_and_names[self.tool]
tool_obj = tool_cls()
tools.append(tool_obj)
assistant_id = None
thread_id = None
if self.input_assistant_id:
assistant_id = self.input_assistant_id
if self.input_thread_id:
thread_id = self.input_thread_id
assistant_manager = AssistantManager(
instructions=self.instructions,
model=self.model_name,
name="managed_assistant",
tools=tools,
client=self.client,
thread_id=thread_id,
assistant_id=assistant_id,
)

content = self.user_message
result = await assistant_manager.run_thread(content=content, tool=tool_obj)
self.assistant_response = Message(text=result["text"])
if "decision" in result:
self.tool_output = Message(text=str(result["decision"].is_complete))
else:
self.tool_output = Message(text=result["text"])
self.thread_id = Message(text=assistant_manager.thread.id)
self.assistant_id = Message(text=assistant_manager.assistant.id)
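The four output getters in this new component all funnel through `initialize()`, which pairs an `asyncio.Lock` with a boolean flag so that `process_inputs()` runs exactly once no matter which outputs are requested first, or how many are awaited concurrently. A minimal, self-contained sketch of that pattern (illustrative names, not Langflow's actual API):

```python
import asyncio


class LazyInitComponent:
    """Sketch of the lock-guarded, run-once initialization used above.

    Several async output getters may be awaited concurrently; the lock
    ensures the expensive setup coroutine executes exactly once.
    """

    def __init__(self):
        self._lock = asyncio.Lock()
        self._initialized = False
        self.result = None
        self.init_count = 0  # for demonstration only

    async def _process_inputs(self):
        # Stand-in for the real work (e.g. running an assistant thread).
        self.init_count += 1
        await asyncio.sleep(0)
        self.result = "ready"

    async def initialize(self):
        async with self._lock:
            if not self._initialized:
                await self._process_inputs()
                self._initialized = True

    async def get_result(self):
        await self.initialize()
        return self.result


async def main():
    comp = LazyInitComponent()
    # Awaiting four getters concurrently still initializes only once:
    # the first to acquire the lock does the work, the rest see the flag.
    results = await asyncio.gather(*(comp.get_result() for _ in range(4)))
    return results, comp.init_count


results, count = asyncio.run(main())
print(results, count)
```

Without the lock, two getters awaited at the same time could both observe `initialized == False` and run the setup twice; the lock serializes the check-then-set.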
@@ -1,17 +1,14 @@
from astra_assistants import patch # type: ignore
from openai import OpenAI

from langflow.custom import Component
from langflow.components.astra_assistants.util import get_patched_openai_client
from langflow.custom.custom_component.component_with_cache import ComponentWithCache
from langflow.inputs import MultilineInput, StrInput
from langflow.schema.message import Message
from langflow.template import Output


class AssistantsCreateAssistant(Component):
class AssistantsCreateAssistant(ComponentWithCache):
icon = "bot"
display_name = "Create Assistant"
description = "Creates an Assistant and returns it's id"
client = patch(OpenAI())

inputs = [
StrInput(
@@ -46,6 +43,10 @@ class AssistantsCreateAssistant(Component):
Output(display_name="Assistant ID", name="assistant_id", method="process_inputs"),
]

def __init__(self, **kwargs):
super().__init__(**kwargs)
self.client = get_patched_openai_client(self._shared_component_cache)

def process_inputs(self) -> Message:
print(f"env_set is {self.env_set}")
assistant = self.client.beta.assistants.create(
@@ -1,16 +1,13 @@
from astra_assistants import patch # type: ignore
from openai import OpenAI

from langflow.custom import Component
from langflow.components.astra_assistants.util import get_patched_openai_client
from langflow.custom.custom_component.component_with_cache import ComponentWithCache
from langflow.inputs import MultilineInput
from langflow.schema.message import Message
from langflow.template import Output


class AssistantsCreateThread(Component):
class AssistantsCreateThread(ComponentWithCache):
display_name = "Create Assistant Thread"
description = "Creates a thread and returns the thread id"
client = patch(OpenAI())

inputs = [
MultilineInput(
@@ -24,6 +21,10 @@ class AssistantsCreateThread(Component):
Output(display_name="Thread ID", name="thread_id", method="process_inputs"),
]

def __init__(self, **kwargs):
super().__init__(**kwargs)
self.client = get_patched_openai_client(self._shared_component_cache)

def process_inputs(self) -> Message:
thread = self.client.beta.threads.create()
thread_id = thread.id
@@ -1,16 +1,13 @@
from astra_assistants import patch # type: ignore
from openai import OpenAI

from langflow.custom import Component
from langflow.components.astra_assistants.util import get_patched_openai_client
from langflow.custom.custom_component.component_with_cache import ComponentWithCache
from langflow.inputs import MultilineInput, StrInput
from langflow.schema.message import Message
from langflow.template import Output


class AssistantsGetAssistantName(Component):
class AssistantsGetAssistantName(ComponentWithCache):
display_name = "Get Assistant name"
description = "Assistant by id"
client = patch(OpenAI())

inputs = [
StrInput(
@@ -29,6 +26,10 @@ class AssistantsGetAssistantName(Component):
Output(display_name="Assistant Name", name="assistant_name", method="process_inputs"),
]

def __init__(self, **kwargs):
super().__init__(**kwargs)
self.client = get_patched_openai_client(self._shared_component_cache)

def process_inputs(self) -> Message:
assistant = self.client.beta.assistants.retrieve(
assistant_id=self.assistant_id,
@@ -1,20 +1,21 @@
from astra_assistants import patch # type: ignore
from openai import OpenAI

from langflow.custom import Component
from langflow.components.astra_assistants.util import get_patched_openai_client
from langflow.custom.custom_component.component_with_cache import ComponentWithCache
from langflow.schema.message import Message
from langflow.template.field.base import Output


class AssistantsListAssistants(Component):
class AssistantsListAssistants(ComponentWithCache):
display_name = "List Assistants"
description = "Returns a list of assistant id's"
client = patch(OpenAI())

outputs = [
Output(display_name="Assistants", name="assistants", method="process_inputs"),
]

def __init__(self, **kwargs):
super().__init__(**kwargs)
self.client = get_patched_openai_client(self._shared_component_cache)

def process_inputs(self) -> Message:
assistants = self.client.beta.assistants.list().data
id_list = [assistant.id for assistant in assistants]
11 changes: 8 additions & 3 deletions src/backend/base/langflow/components/astra_assistants/run.py
@@ -4,17 +4,22 @@
from openai import OpenAI
from openai.lib.streaming import AssistantEventHandler

from langflow.custom import Component
from langflow.components.astra_assistants.util import get_patched_openai_client
from langflow.custom.custom_component.component_with_cache import ComponentWithCache
from langflow.inputs import MultilineInput
from langflow.schema import dotdict
from langflow.schema.message import Message
from langflow.template import Output


class AssistantsRun(Component):
class AssistantsRun(ComponentWithCache):
display_name = "Run Assistant"
description = "Executes an Assistant Run against a thread"
client = patch(OpenAI())

def __init__(self, **kwargs):
super().__init__(**kwargs)
self.client = get_patched_openai_client(self._shared_component_cache)
self.thread_id = None

def update_build_config(
self,
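A recurring change across these diffs is replacing the class-level `client = patch(OpenAI())` with a per-instance call to `get_patched_openai_client(self._shared_component_cache)`, so every component shares one patched client through the new shared component cache service instead of constructing its own at class-definition time. A minimal get-or-create sketch of that idea (hypothetical names; not the actual Langflow service API):

```python
from threading import Lock


class SharedComponentCache:
    """Minimal stand-in for a shared component cache service.

    The names and methods here are illustrative, not Langflow's real API.
    """

    def __init__(self):
        self._store = {}
        self._lock = Lock()

    def get(self, key, default=None):
        with self._lock:
            return self._store.get(key, default)

    def set(self, key, value):
        with self._lock:
            self._store[key] = value


def get_patched_client(cache, factory):
    """Get-or-create: build the (expensive) patched client once, then share it."""
    client = cache.get("openai_client")
    if client is None:
        client = factory()
        cache.set("openai_client", client)
    return client


# Usage: two "components" asking for the client receive the same object,
# and the factory runs only once.
cache = SharedComponentCache()
calls = []


def make_client():
    calls.append(1)  # record each construction
    return object()  # stand-in for patch(OpenAI())


c1 = get_patched_client(cache, make_client)
c2 = get_patched_client(cache, make_client)
print(c1 is c2, len(calls))
```

Compared with a class attribute, this defers client construction until a component instance actually needs it, and keeps the shared state in an injectable service rather than in import-time module state.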