Add support for thinking LLMs directly in gr.ChatInterface
#10305
Merged
Changes from all commits (98 commits)
cc5c2cb
ungroup thoughts from messages
hannahblair 773db7f
rename messagebox to thought
hannahblair 2b8da48
refactor
hannahblair 281ff2a
* add metadata typing
hannahblair 915439e
tweaks
hannahblair 134a7b7
tweak
hannahblair 462f394
add changeset
gradio-pr-bot 894d5bd
Merge branch 'thought-ui' of github.com:gradio-app/gradio into though…
hannahblair 56fcac2
fix expanded rotation
hannahblair ea9eaf3
border radius
hannahblair 1e731b7
update thought design
hannahblair bde1c69
move spinner
hannahblair 983059a
Merge branch 'main' into thought-ui
hannahblair 9aedecc
Merge branch 'main' into thought-ui
hannahblair 8d2aa02
prevent circular reference
hannahblair 653f66e
revert border removal
hannahblair 220804c
css tweaks
hannahblair e40f69a
border tweak
hannahblair 9d4e00a
move chevron to the left
hannahblair bd47db8
tweak nesting logic
hannahblair 811429d
thought group spacing
hannahblair fa2caa4
update run.py
hannahblair 1cdaf40
icon changes
abidlabs 1d2129d
format
abidlabs cc9ef78
add changeset
gradio-pr-bot cc31a37
Merge branch 'main' into thought-ui
abidlabs 259a8ea
add nested thought demo
abidlabs 4e117eb
changes
abidlabs ebafced
changes
abidlabs b928677
changes
abidlabs bc1ba8c
add demo
abidlabs 105009b
docs
abidlabs e77db20
guide
abidlabs 87d79b5
refactor styles and clean up logic
hannahblair 8914bc5
Merge branch 'thought-ui' of github.com:gradio-app/gradio into though…
hannahblair 863a794
revert demo change and and deeper nested thought to demo
hannahblair 0469e48
add optional duration to message types
hannahblair 340d980
add nested thoughts story
hannahblair 39e3e0c
format
hannahblair 6fa2005
guide
abidlabs e4c6fc2
Merge branch 'thought-ui' into thought-ci
abidlabs ce07e83
change dropdown icon button
hannahblair 0f7515a
remove requirement for id's in nested thoughts
hannahblair 93972a8
support markdown in thought title
hannahblair 7de8b1d
get thought content in copied value
hannahblair 02eefbc
add funcs to utils
hannahblair fc993c4
move is_all_text
hannahblair e85b242
remove comment
hannahblair 519351a
Merge branch 'main' into thought-ui
hannahblair af59ddc
Merge branch 'main' into thought-ui
hannahblair 9cf679d
Merge branch 'thought-ui' into thought-ci
abidlabs 7bf0e7d
notebook
abidlabs 57afd35
Merge branch 'thought-ui' into thought-ci
abidlabs 8f85a10
change bot padding
hannahblair 7c0bb48
Merge branch 'thought-ui' into thought-ci
abidlabs cc224db
changes
abidlabs c8c6668
Merge branch 'thought-ci' of github.com:gradio-app/gradio into though…
abidlabs 8ed7aa7
changes
abidlabs 5c77033
changes
abidlabs 0846df2
panel css fix
hannahblair 52615f9
Merge branch 'thought-ui' into thought-ci
abidlabs 7a8b4e9
changes
abidlabs 2b0f98e
Merge branch 'thought-ci' of github.com:gradio-app/gradio into though…
abidlabs b2a23a3
changes
abidlabs 294ebe6
changes
abidlabs 2811955
changes
abidlabs 369d7a3
tweak thought content opacity
hannahblair fc05ff3
more changes
abidlabs ef41371
Merge branch 'main' into thought-ci
abidlabs 74ac11e
add changeset
gradio-pr-bot ea0300f
changes
abidlabs 54b20c5
Merge branch 'thought-ci' of github.com:gradio-app/gradio into though…
abidlabs 86062fe
restore
abidlabs 1cd931e
changes
abidlabs dc95cbe
changes
abidlabs 430a6ba
revert everythign
abidlabs 94d2053
revert everythign
abidlabs 2290fdf
revert
abidlabs ec7ea8e
changes
abidlabs 793fc4d
revert
abidlabs 82f77e5
make changes to demo
abidlabs dd21c41
notebooks
abidlabs 2b180c3
more docs
abidlabs 75b74ae
format
abidlabs 1d74544
changes
abidlabs eea54bf
changes
abidlabs fd0969c
update demo
abidlabs 085ee60
fix typing issues
abidlabs 8f0dbce
chatbot
abidlabs d1a0d76
document chatmessage helper class
aliabd edb7b40
add changeset
gradio-pr-bot 0c4cfb0
Merge branches 'thought-ci' and 'thought-ci' of github.com:gradio-app…
hannahblair e061be1
changes
abidlabs 8ad9dbd
format
abidlabs edb057b
docs
abidlabs 8607dcc
Merge branch 'main' into thought-ci
abidlabs 19c2b1e
fix issue with chatmessage
aliabd e229dfa
Merge branch 'thought-ci' of https://github.com/gradio-app/gradio int…
aliabd
@@ -0,0 +1,7 @@
---
"@gradio/chatbot": minor
"gradio": minor
"website": minor
---

feat:Add support for thinking LLMs directly in `gr.ChatInterface`
@@ -0,0 +1 @@
{"cells": [{"cell_type": "markdown", "id": "302934307671667531413257853548643485645", "metadata": {}, "source": ["# Gradio Demo: chatinterface_nested_thoughts"]}, {"cell_type": "code", "execution_count": null, "id": "272996653310673477252411125948039410165", "metadata": {}, "outputs": [], "source": ["!pip install -q gradio "]}, {"cell_type": "code", "execution_count": null, "id": "288918539441861185822528903084949547379", "metadata": {}, "outputs": [], "source": ["import gradio as gr\n", "from gradio import ChatMessage\n", "import time\n", "\n", "sleep_time = 0.1\n", "long_sleep_time = 1\n", "\n", "def generate_response(message, history):\n", "    start_time = time.time()\n", "    responses = [\n", "        ChatMessage(\n", "            content=\"In order to find the current weather in San Francisco, I will need to use my weather tool.\",\n", "        )\n", "    ]\n", "    yield responses\n", "    time.sleep(sleep_time)\n", "\n", "    main_thought = ChatMessage(\n", "        content=\"\",\n", "        metadata={\"title\": \"Using Weather Tool\", \"id\": 1, \"status\": \"pending\"},\n", "    )\n", "\n", "    responses.append(main_thought)\n", "\n", "    yield responses\n", "    time.sleep(long_sleep_time)\n", "    responses[-1].content = \"Will check: weather.com and sunny.org\"\n", "    yield responses\n", "    time.sleep(sleep_time)\n", "    responses.append(\n", "        ChatMessage(\n", "            content=\"Received weather from weather.com.\",\n", "            metadata={\"title\": \"Checking weather.com\", \"parent_id\": 1, \"id\": 2, \"duration\": 0.05},\n", "        )\n", "    )\n", "    yield responses\n", "\n", "    sunny_start_time = time.time()\n", "    time.sleep(sleep_time)\n", "    sunny_thought = ChatMessage(\n", "        content=\"API Error when connecting to sunny.org \ud83d\udca5\",\n", "        metadata={\"title\": \"Checking sunny.org\", \"parent_id\": 1, \"id\": 3, \"status\": \"pending\"},\n", "    )\n", "\n", "    responses.append(sunny_thought)\n", "    yield responses\n", "\n", "    time.sleep(sleep_time)\n", "    responses.append(\n", "        ChatMessage(\n", "            content=\"Failed again\",\n", "            metadata={\"title\": \"I will try again\", \"id\": 4, \"parent_id\": 3, \"duration\": 0.1},\n", "        )\n", "    )\n", "    sunny_thought.metadata[\"status\"] = \"done\"\n", "    sunny_thought.metadata[\"duration\"] = time.time() - sunny_start_time\n", "\n", "    main_thought.metadata[\"status\"] = \"done\"\n", "    main_thought.metadata[\"duration\"] = time.time() - start_time\n", "\n", "    yield responses\n", "\n", "    time.sleep(long_sleep_time)\n", "\n", "    responses.append(\n", "        ChatMessage(\n", "            content=\"Based on the data only from weather.com, the current weather in San Francisco is 60 degrees and sunny.\",\n", "        )\n", "    )\n", "    yield responses\n", "\n", "demo = gr.ChatInterface(\n", "    generate_response,\n", "    type=\"messages\",\n", "    title=\"Nested Thoughts Chat Interface\",\n", "    examples=[\"What is the weather in San Francisco right now?\"]\n", ")\n", "\n", "if __name__ == \"__main__\":\n", "    demo.launch()\n"]}], "metadata": {}, "nbformat": 4, "nbformat_minor": 5}
@@ -0,0 +1,81 @@
import gradio as gr
from gradio import ChatMessage
import time

sleep_time = 0.1
long_sleep_time = 1

def generate_response(message, history):
    start_time = time.time()
    responses = [
        ChatMessage(
            content="In order to find the current weather in San Francisco, I will need to use my weather tool.",
        )
    ]
    yield responses
    time.sleep(sleep_time)

    main_thought = ChatMessage(
        content="",
        metadata={"title": "Using Weather Tool", "id": 1, "status": "pending"},
    )

    responses.append(main_thought)

    yield responses
    time.sleep(long_sleep_time)
    responses[-1].content = "Will check: weather.com and sunny.org"
    yield responses
    time.sleep(sleep_time)
    responses.append(
        ChatMessage(
            content="Received weather from weather.com.",
            metadata={"title": "Checking weather.com", "parent_id": 1, "id": 2, "duration": 0.05},
        )
    )
    yield responses

    sunny_start_time = time.time()
    time.sleep(sleep_time)
    sunny_thought = ChatMessage(
        content="API Error when connecting to sunny.org 💥",
        metadata={"title": "Checking sunny.org", "parent_id": 1, "id": 3, "status": "pending"},
    )

    responses.append(sunny_thought)
    yield responses

    time.sleep(sleep_time)
    responses.append(
        ChatMessage(
            content="Failed again",
            metadata={"title": "I will try again", "id": 4, "parent_id": 3, "duration": 0.1},
        )
    )
    sunny_thought.metadata["status"] = "done"
    sunny_thought.metadata["duration"] = time.time() - sunny_start_time

    main_thought.metadata["status"] = "done"
    main_thought.metadata["duration"] = time.time() - start_time

    yield responses

    time.sleep(long_sleep_time)

    responses.append(
        ChatMessage(
            content="Based on the data only from weather.com, the current weather in San Francisco is 60 degrees and sunny.",
        )
    )
    yield responses

demo = gr.ChatInterface(
    generate_response,
    type="messages",
    title="Nested Thoughts Chat Interface",
    examples=["What is the weather in San Francisco right now?"]
)

if __name__ == "__main__":
    demo.launch()
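The demo above nests thoughts purely through the `id`/`parent_id` pairs in each message's metadata. As an illustration of how a flat message list like the one this generator yields can be grouped into a tree, here is a hedged standalone sketch (plain dicts, hypothetical helper name; this is not gradio's internal grouping logic):

```python
def group_thoughts(messages):
    """Group a flat list of message dicts into a tree using
    metadata "id"/"parent_id", mirroring the demo's ids.
    Illustrative sketch only, not the gradio implementation."""
    nodes = {}   # id -> node, for parent lookups
    roots = []   # nodes with no known parent
    for msg in messages:
        meta = msg.get("metadata") or {}
        node = {"msg": msg, "children": []}
        if "id" in meta:
            nodes[meta["id"]] = node
        parent = nodes.get(meta.get("parent_id"))
        if parent is not None:
            parent["children"].append(node)
        else:
            roots.append(node)
    return roots

# Same id/parent_id structure as the nested-thoughts demo above.
flat = [
    {"content": "Using Weather Tool", "metadata": {"id": 1}},
    {"content": "Checking weather.com", "metadata": {"id": 2, "parent_id": 1}},
    {"content": "Checking sunny.org", "metadata": {"id": 3, "parent_id": 1}},
    {"content": "I will try again", "metadata": {"id": 4, "parent_id": 3}},
]
tree = group_thoughts(flat)
```

This relies on parents being yielded before their children, which is how the demo streams its thoughts.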
@@ -1 +1 @@
- {"cells": [{"cell_type": "markdown", "id": "302934307671667531413257853548643485645", "metadata": {}, "source": ["# Gradio Demo: chatinterface_options"]}, {"cell_type": "code", "execution_count": null, "id": "272996653310673477252411125948039410165", "metadata": {}, "outputs": [], "source": ["!pip install -q gradio "]}, {"cell_type": "code", "execution_count": null, "id": "288918539441861185822528903084949547379", "metadata": {}, "outputs": [], "source": ["import gradio as gr\n", "import random\n", "\n", "example_code = \"\"\"\n", "Here's an example Python lambda function:\n", "\n", "lambda x: x + {}\n", "\n", "Is this correct?\n", "\"\"\"\n", "\n", "def chat(message, history):\n", "    if message == \"Yes, that's correct.\":\n", "        return \"Great!\"\n", "    else:\n", "        return {\n", "            \"role\": \"assistant\",\n", "            \"content\": example_code.format(random.randint(1, 100)),\n", "            \"options\": [\n", "                {\"value\": \"Yes, that's correct.\", \"label\": \"Yes\"},\n", "                {\"value\": \"No\"}\n", "            ]\n", "        }\n", "\n", "demo = gr.ChatInterface(\n", "    chat,\n", "    type=\"messages\",\n", "    examples=[\"Write an example Python lambda function.\"]\n", ")\n", "\n", "if __name__ == \"__main__\":\n", "    demo.launch()\n"]}], "metadata": {}, "nbformat": 4, "nbformat_minor": 5}
+ {"cells": [{"cell_type": "markdown", "id": "302934307671667531413257853548643485645", "metadata": {}, "source": ["# Gradio Demo: chatinterface_options"]}, {"cell_type": "code", "execution_count": null, "id": "272996653310673477252411125948039410165", "metadata": {}, "outputs": [], "source": ["!pip install -q gradio "]}, {"cell_type": "code", "execution_count": null, "id": "288918539441861185822528903084949547379", "metadata": {}, "outputs": [], "source": ["import gradio as gr\n", "import random\n", "\n", "example_code = \"\"\"\n", "Here's an example Python lambda function:\n", "\n", "lambda x: x + {}\n", "\n", "Is this correct?\n", "\"\"\"\n", "\n", "def chat(message, history):\n", "    if message == \"Yes, that's correct.\":\n", "        return \"Great!\"\n", "    else:\n", "        return gr.ChatMessage(\n", "            content=example_code.format(random.randint(1, 100)),\n", "            options=[\n", "                {\"value\": \"Yes, that's correct.\", \"label\": \"Yes\"},\n", "                {\"value\": \"No\"}\n", "            ]\n", "        )\n", "\n", "demo = gr.ChatInterface(\n", "    chat,\n", "    type=\"messages\",\n", "    examples=[\"Write an example Python lambda function.\"]\n", ")\n", "\n", "if __name__ == \"__main__\":\n", "    demo.launch()\n"]}], "metadata": {}, "nbformat": 4, "nbformat_minor": 5}
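The options demo above passes options both as `{"value": ..., "label": ...}` and as `{"value": ...}` alone, where the label shown to the user falls back to the value. A small hedged sketch of that fallback (hypothetical helper, not a gradio API):

```python
def resolve_option(option):
    """Return (label, value) for an option dict, assuming each side
    falls back to the other when absent, as in the demo's
    {"value": "No"} option. Illustrative only."""
    label = option.get("label", option.get("value"))
    value = option.get("value", option.get("label"))
    return label, value

# The two option shapes used in the demo above.
opts = [
    {"value": "Yes, that's correct.", "label": "Yes"},
    {"value": "No"},
]
resolved = [resolve_option(o) for o in opts]
```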
@@ -0,0 +1 @@
{"cells": [{"cell_type": "markdown", "id": "302934307671667531413257853548643485645", "metadata": {}, "source": ["# Gradio Demo: chatinterface_thoughts"]}, {"cell_type": "code", "execution_count": null, "id": "272996653310673477252411125948039410165", "metadata": {}, "outputs": [], "source": ["!pip install -q gradio "]}, {"cell_type": "code", "execution_count": null, "id": "288918539441861185822528903084949547379", "metadata": {}, "outputs": [], "source": ["import gradio as gr\n", "from gradio import ChatMessage\n", "import time\n", "\n", "sleep_time = 0.5\n", "\n", "def simulate_thinking_chat(message, history):\n", "    start_time = time.time()\n", "    response = ChatMessage(\n", "        content=\"\",\n", "        metadata={\"title\": \"_Thinking_ step-by-step\", \"id\": 0, \"status\": \"pending\"}\n", "    )\n", "    yield response\n", "\n", "    thoughts = [\n", "        \"First, I need to understand the core aspects of the query...\",\n", "        \"Now, considering the broader context and implications...\",\n", "        \"Analyzing potential approaches to formulate a comprehensive answer...\",\n", "        \"Finally, structuring the response for clarity and completeness...\"\n", "    ]\n", "\n", "    accumulated_thoughts = \"\"\n", "    for thought in thoughts:\n", "        time.sleep(sleep_time)\n", "        accumulated_thoughts += f\"- {thought}\\n\\n\"\n", "        response.content = accumulated_thoughts.strip()\n", "        yield response\n", "\n", "    response.metadata[\"status\"] = \"done\"\n", "    response.metadata[\"duration\"] = time.time() - start_time\n", "    yield response\n", "\n", "    response = [\n", "        response,\n", "        ChatMessage(\n", "            content=\"Based on my thoughts and analysis above, my response is: This dummy repro shows how thoughts of a thinking LLM can be progressively shown before providing its final answer.\"\n", "        )\n", "    ]\n", "    yield response\n", "\n", "\n", "demo = gr.ChatInterface(\n", "    simulate_thinking_chat,\n", "    title=\"Thinking LLM Chat Interface \ud83e\udd14\",\n", "    type=\"messages\",\n", ")\n", "\n", "if __name__ == \"__main__\":\n", "    demo.launch()\n"]}], "metadata": {}, "nbformat": 4, "nbformat_minor": 5}
@@ -0,0 +1,49 @@
import gradio as gr
from gradio import ChatMessage
import time

sleep_time = 0.5

def simulate_thinking_chat(message, history):
    start_time = time.time()
    response = ChatMessage(
        content="",
        metadata={"title": "_Thinking_ step-by-step", "id": 0, "status": "pending"}
    )
    yield response

    thoughts = [
        "First, I need to understand the core aspects of the query...",
        "Now, considering the broader context and implications...",
        "Analyzing potential approaches to formulate a comprehensive answer...",
        "Finally, structuring the response for clarity and completeness..."
    ]

    accumulated_thoughts = ""
    for thought in thoughts:
        time.sleep(sleep_time)
        accumulated_thoughts += f"- {thought}\n\n"
        response.content = accumulated_thoughts.strip()
        yield response

    response.metadata["status"] = "done"
    response.metadata["duration"] = time.time() - start_time
    yield response

    response = [
        response,
        ChatMessage(
            content="Based on my thoughts and analysis above, my response is: This dummy repro shows how thoughts of a thinking LLM can be progressively shown before providing its final answer."
        )
    ]
    yield response


demo = gr.ChatInterface(
    simulate_thinking_chat,
    title="Thinking LLM Chat Interface 🤔",
    type="messages",
)

if __name__ == "__main__":
    demo.launch()
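The demo above streams a single pending thought by repeatedly yielding the same message with its content grown one bullet at a time. The accumulation itself is plain string building, sketched here outside gradio so the yielded states are easy to inspect:

```python
def accumulate(thoughts):
    """Yield the progressively accumulated bullet list, one state per
    thought, as the demo's generator does before marking the thought
    done. Standalone sketch of the pattern, no gradio involved."""
    acc = ""
    for thought in thoughts:
        acc += f"- {thought}\n\n"
        # strip() trims the trailing blank line, matching the demo
        yield acc.strip()

steps = list(accumulate(["first", "second"]))
```

Each yielded state replaces the thought's content in the UI, so the bullet list appears to grow in place.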
@@ -37,6 +37,8 @@ class MetadataDict(TypedDict):
    title: Union[str, None]
    id: NotRequired[int | str]
    parent_id: NotRequired[int | str]
    duration: NotRequired[float]
    status: NotRequired[Literal["pending", "done"]]


class Option(TypedDict):

@@ -59,7 +61,6 @@ class MessageDict(TypedDict):
    role: Literal["user", "assistant", "system"]
    metadata: NotRequired[MetadataDict]
    options: NotRequired[list[Option]]
    duration: NotRequired[int]


class FileMessage(GradioModel):

@@ -87,14 +88,21 @@ class Metadata(GradioModel):
    title: Optional[str] = None
    id: Optional[int | str] = None
    parent_id: Optional[int | str] = None
    duration: Optional[float] = None
    [review on the line above] should it be … / [reply] it can be a float in the general case, I'll update the other reference!
    status: Optional[Literal["pending", "done"]] = None

    def __setitem__(self, key: str, value: Any) -> None:
        setattr(self, key, value)

    def __getitem__(self, key: str) -> Any:
        return getattr(self, key)


class Message(GradioModel):
    role: str
    metadata: Metadata = Field(default_factory=Metadata)
    content: Union[str, FileMessage, ComponentMessage]
    options: Optional[list[Option]] = None
    duration: Optional[int] = None


class ExampleMessage(TypedDict):

@@ -110,13 +118,22 @@ class ExampleMessage(TypedDict):
    ]  # list of file paths or URLs to be added to chatbot when example is clicked


@document()
@dataclass
class ChatMessage:
    role: Literal["user", "assistant", "system"]
    """
    A dataclass to represent a message in the Chatbot component (type="messages").
    Parameters:
        content: The content of the message. Can be a string or a Gradio component.
        role: The role of the message, which determines the alignment of the message in the chatbot. Can be "user", "assistant", or "system". Defaults to "assistant".
        metadata: The metadata of the message, which is used to display intermediate thoughts / tool usage. Should be a dictionary with the following keys: "title" (required to display the thought), and optionally: "id" and "parent_id" (to nest thoughts), "duration" (to display the duration of the thought), "status" (to display the status of the thought).
        options: The options of the message. A list of Option objects, which are dictionaries with the following keys: "label" (the text to display in the option), and optionally "value" (the value to return when the option is selected if different from the label).
    """

    content: str | FileData | Component | FileDataDict | tuple | list
    role: Literal["user", "assistant", "system"] = "assistant"
    metadata: MetadataDict | Metadata = field(default_factory=Metadata)
    options: Optional[list[Option]] = None
    duration: Optional[int] = None


class ChatbotDataMessages(GradioRootModel):

@@ -545,7 +562,6 @@ def _postprocess_message_messages(
            content=message.content,  # type: ignore
            metadata=message.metadata,  # type: ignore
            options=message.options,
            duration=message.duration,
        )
    elif isinstance(message, Message):
        return message
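The hunks above give the attribute-based `Metadata` model `__setitem__`/`__getitem__`, which is what lets the demos write `response.metadata["status"] = "done"`. A minimal standalone sketch of that pattern using a plain dataclass (field names mirror the diff; this is not gradio's pydantic `GradioModel`):

```python
from dataclasses import dataclass
from typing import Any, Literal, Optional, Union

@dataclass
class Metadata:
    # Fields mirror the Metadata model in the diff above.
    title: Optional[str] = None
    id: Optional[Union[int, str]] = None
    parent_id: Optional[Union[int, str]] = None
    duration: Optional[float] = None
    status: Optional[Literal["pending", "done"]] = None

    def __setitem__(self, key: str, value: Any) -> None:
        # dict-style write delegates to attribute access
        setattr(self, key, value)

    def __getitem__(self, key: str) -> Any:
        # dict-style read delegates to attribute access
        return getattr(self, key)

meta = Metadata(title="Using Weather Tool", id=1, status="pending")
meta["status"] = "done"   # works like the demos' metadata updates
meta["duration"] = 1.5
```

The design lets demo code treat metadata uniformly whether it was supplied as a plain dict or as a model instance.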
Review comment:
Note that I moved `duration` inside of `metadata`, where I think it makes more sense. I also added a `status` for more control over the loading spinner, as it improves the UI a bit (cc @yvrjsharma)

Reply:
yep that makes sense!
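The comment above describes moving `duration` from the top level of a message into its `metadata` dict, which is what the `MessageDict` and `_postprocess_message_messages` hunks implement. As a hedged sketch of what that relocation looks like for a plain message dict (a hypothetical helper, not part of gradio):

```python
def move_duration_into_metadata(message):
    """Return a copy of a message dict with a legacy top-level
    'duration' relocated under 'metadata', matching the schema
    change described in this PR. Illustrative helper only."""
    msg = dict(message)  # shallow copy; the input is left untouched
    if "duration" in msg:
        meta = dict(msg.get("metadata") or {})
        meta["duration"] = msg.pop("duration")
        msg["metadata"] = meta
    return msg

old_style = {"role": "assistant", "content": "done thinking", "duration": 2.0}
new_style = move_duration_into_metadata(old_style)
```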