Thread

Sasha Ilatovskii 6:51 PM
Hey team 👋. I'm Sasha :)
I’m interested in contributing to this issue: https://github.com/archestra-ai/archestra/issues/3838
Before I jump in, I wanted to ask - is there already a defined vision or some expectations for how this should be approached? Or is this more of an open-ended task where contributors are encouraged to propose their own direction/implementation?
Happy to align with any existing ideas, or draft a proposal if it’s still exploratory.

7 replies
Sasha Ilatovskii 2:18 PM
Not sure who’s the best person to tag here, so I’m mentioning @user since you opened this issue.
I noticed an issue with the current chat component behavior and wanted to document it.
If a user sends a message to the agent and the response takes a long time (e.g., due to a long-running task), then after refreshing the page the client does not reconnect to the ongoing event stream. As a result, the user won’t see any intermediate updates or the response in progress. The final response only becomes visible after the task fully completes and the page is refreshed again.
This creates a gap where the user effectively loses visibility into the active conversation until completion.
Would be great to confirm if this is expected behavior or something we should address (e.g., by restoring the event stream or syncing state on reload).
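To make the gap concrete, here is one possible direction for the reload case, sketched as a minimal TypeScript helper. All names here (`PendingStream`, the `/api/chats/.../streams/...` URL shape) are hypothetical, not Archestra's actual API: the idea is just to persist the id of the in-flight stream before starting it, and on page load rebuild a resume request carrying the last event id the client already applied.

```typescript
// Sketch only: hypothetical names, not Archestra's real chat code.
// A Map stands in for localStorage/sessionStorage so this runs anywhere.

interface PendingStream {
  chatId: string;
  streamId: string;
  lastEventId: number; // highest event id the client has already rendered
}

// Called right before the client starts consuming a stream;
// cleared (not shown) once the stream finishes normally.
function savePendingStream(storage: Map<string, string>, s: PendingStream): void {
  storage.set("pendingStream", JSON.stringify(s));
}

// Called on page load: if a stream was in flight, build the reconnect request.
// SSE clients send Last-Event-ID automatically on auto-reconnect, but after a
// full page reload the client has to supply it explicitly.
function resumeRequest(
  storage: Map<string, string>,
): { url: string; headers: Record<string, string> } | null {
  const raw = storage.get("pendingStream");
  if (!raw) return null;
  const s: PendingStream = JSON.parse(raw);
  return {
    url: `/api/chats/${s.chatId}/streams/${s.streamId}`,
    headers: { "Last-Event-ID": String(s.lastEventId) },
  };
}
```

This only covers the client half; the server would also need to keep the stream's events addressable by id so it can replay the ones the client missed.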
joey (archestra team) 3:58 PM
is this more of an open-ended task where contributors are encouraged to propose their own direction/implementation
hey! it's very much the latter right now 🙂
at a high level I would share that the chat UI/UX has some generic rough edges (if you compare it to, say, the Claude Desktop or ChatGPT applications)
There are some pre-built components in Vercel AI Elements which could be low-hanging fruit to step up the UX of the chat (some quick examples are Chain of Thought and/or Reasoning), but beyond that I would prefer leaving it up to the implementor to define a list of what you think are the highest-ROI chat rough edges to tackle/implement.
joey (archestra team) 3:58 PM
does that answer your question a bit / give some direction?
Sasha Ilatovskii 4:11 PM
Yeah, that helps a bit as a starting point, thanks!
Right now I’m trying to get a better understanding of what building blocks already exist in the current version of the chat. I’ve also put together a small demo to explore how different blocks get rendered based on the agent’s responses — mainly to see what’s already possible and where the gaps are.
Planning to use that as a baseline before proposing anything.
Sasha Ilatovskii 4:13 PM
and what about my second message in this thread: are you interested in "restoring the event stream" for long-running agent tasks and responses?
joey (archestra team) 5:16 PM
If a user sends a message to the agent and the response takes a long time (e.g., due to a long-running task), then after refreshing the page the client does not reconnect to the ongoing event stream. As a result, the user won’t see any intermediate updates or the response in progress. The final response only becomes visible after the task fully completes and the page is refreshed again
absolutely - the end-state you proposed makes sense. I am not sure myself if that's possible, but if it is, and you're able to demonstrate the change, I would say it'd be a huge improvement
(it could very well be possible btw - i haven't been very deep into the chat streaming code in quite some time)
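For the server half of making this possible, one common pattern (again a hypothetical sketch, not how Archestra's streaming code works today) is to buffer each stream's events with monotonically increasing ids, so a reconnecting client can be caught up from its Last-Event-ID before switching over to live events:

```typescript
// Sketch of a replayable per-stream event buffer (hypothetical names).
interface StreamEvent {
  id: number;
  data: string;
}

class ReplayableStream {
  private events: StreamEvent[] = [];
  private nextId = 1;
  private done = false;

  // Record a new event as the agent produces output; the assigned id is
  // what the client would echo back as Last-Event-ID on reconnect.
  push(data: string): StreamEvent {
    const ev = { id: this.nextId++, data };
    this.events.push(ev);
    return ev;
  }

  // Mark the stream complete so the buffer can eventually be dropped.
  finish(): void {
    this.done = true;
  }

  get finished(): boolean {
    return this.done;
  }

  // Everything a reconnecting client has not yet seen.
  replayAfter(lastEventId: number): StreamEvent[] {
    return this.events.filter((ev) => ev.id > lastEventId);
  }
}
```

In a real deployment the buffer would need a retention policy (drop after completion plus a grace period) and, with multiple server instances, a shared store such as Redis rather than in-process memory.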
Sasha Ilatovskii 5:19 PM
Yeah, I actually worked on something similar recently 🙂
In that case I was able to restore the connection to the ongoing stream after a refresh and continue receiving updates, so it should be doable in principle. I’ll try to adapt that approach here and see how well it fits into the current setup — will share a demo if I get it working.