From b367ad5b7705e5f5e6508c22a78c8c2a6a35bdb8 Mon Sep 17 00:00:00 2001
From: carocao-msft <96077406+carocao-msft@users.noreply.github.com>
Date: Tue, 4 Feb 2025 02:02:55 +0000
Subject: [PATCH 1/5] update storybook
---
.../stories/Concepts/ClosedCaptions/Doc.mdx | 36 ++++++++++-------
.../Concepts/InCallNotifications/Docs.mdx | 26 ++++++++++++
.../stories/Concepts/MediaAccess/Doc.mdx | 40 +++++++++++++++++--
.../stories/Concepts/PowerPointLive/Doc.mdx | 28 +++++++++++++
.../stories/Concepts/RaiseHands/Doc.mdx | 17 ++++++++
.../stories/Concepts/RealTimeText/Docs.mdx | 33 ++++++++++++++-
.../stories/Concepts/Survey/Doc.mdx | 32 ++++++++++++---
.../stories/Concepts/TogetherMode/Docs.mdx | 35 +++++++++++++---
8 files changed, 216 insertions(+), 31 deletions(-)
create mode 100644 packages/storybook8/stories/Concepts/InCallNotifications/Docs.mdx
create mode 100644 packages/storybook8/stories/Concepts/RaiseHands/Doc.mdx
diff --git a/packages/storybook8/stories/Concepts/ClosedCaptions/Doc.mdx b/packages/storybook8/stories/Concepts/ClosedCaptions/Doc.mdx
index c308576baca..3dde5159c3c 100644
--- a/packages/storybook8/stories/Concepts/ClosedCaptions/Doc.mdx
+++ b/packages/storybook8/stories/Concepts/ClosedCaptions/Doc.mdx
@@ -11,31 +11,39 @@ import {
# Closed Captions
-Azure Communication Services UI Library is adding support for Closed Captions. Closed captions is a powerful tool that enables developers to enhance the accessibility of their videos. With closed captions, developers can provide a textual representation of the audio content in their videos, making it easier for users who are deaf or hard of hearing to follow along.
+Azure Communication Services UI Library is adding support for Closed Captions. Closed captions are a powerful tool that enables developers to enhance the accessibility of their videos. With closed captions, developers can provide a textual representation of the audio content in their videos, making it easier for users who are deaf or hard of hearing to follow along. Here are the main scenarios where Closed Captions are useful:
-Closed Captions is supported by default and are automatically included within the CallComposite and CallWithChatComposite experiences.
+Accessibility: In the workplace or in consumer apps, Closed Captioning for meetings, conference calls, and training videos can make a dramatic difference. It also helps in scenarios where audio cannot be heard, either because of a noisy environment, such as an airport, or because of an environment that must be kept quiet, such as a hospital.
-Closed Captions is also supported for component users.
+Inclusivity: Closed Captioning was developed to aid hearing-impaired people, but it is also useful for building language proficiency.
-List of components exported for Closed Captions:
+## Incorporating Closed Captions in your experience:
-[StartCaptionsButton](./?path=/docs/components-start-captions-button--docs) is a component that can be used to turn on captions. Developers and use 'usePropsFor' to gather all the information required to power this component.
+The UI Library helps Contoso become more accessible and inclusive by providing closed captions within the composite and component experiences.
-[CaptionsSettingsModal](./?path=/docs/components-captions-settings-modal--docs) is a modal pop up that allows users to select spoken/caption language. Developers and use 'usePropsFor' to gather all the information required to power this component.
+## Interop Closed Captions:
-[CaptionsBanner](./?path=/docs/components-captions-banner--docs) is a component that combines and displays captions and Real-Time Text in one banner. Developers and use 'usePropsFor' to gather all the information required to power this component, including the list of Real-Time Text and captions messages in the call. User can also use this component for captions only or Real-Time Text only.
+Interop Closed Captions are enabled by default and are automatically included within the CallComposite and CallWithChatComposite experiences.
+Captions can be enabled both in Mobile Web sessions and in Desktop Web sessions.
+For interop captions, users can enable captions in the menu and select the spoken language for the captions.
+Captions do not detect language automatically, so the spoken language selected needs to match the language that will be used in the call.
+Translation is also supported in this scenario.
-## Azure Communication Service Based Captions
+## ACS Based Captions:
-Azure Communication Service Closed Captions are supported by default and are automatically included within the CallComposite and CallWithChatComposite experiences for calling scenarios involving Azure Communication Service users only. Captions can be enabled in both Mobile Web sessions and in Desktop Web sessions.
+ACS Closed Captions are enabled by default and are automatically included within the CallComposite and CallWithChatComposite experiences.
+Captions can be enabled both in Mobile Web sessions and in Desktop Web sessions.
+Currently, ACS captions does not support changing caption language or translation. Only changing the spoken language is supported at this time.
-For Azure Communication Service captions, users can enable captions in the menu and select the spoken language for the captions. Captions does not detect language automatically, so the spoken language selected needs to match the language that will be used in the call. Currently, Azure Communication Service captions does not support translation.
+## Closed Captions using Components
-## Teams Interop Closed Captions
+For component users, we export the following components to help you integrate captions in your own calling experiences.
-Teams Interop Closed Captions is supported by default and are automatically included within the CallComposite and CallWithChatComposite experiences during a call including one or more teams users.
+[StartCaptionsButton](./?path=/docs/components-start-captions-button--docs) is a component that can be used to turn on captions. Developers can use `usePropsFor` to gather all the information required to power this component.
-The main difference between Azure Communication Service Closed Captions and Teams Interop Closed Captions is that Teams Interop Closed Captions supports translation. End users can choose to have captions translated to a different language by using captions settings.
+[CaptionsSettingsModal](./?path=/docs/components-captions-settings-modal--docs) is a modal pop-up that allows users to select the spoken/caption language. Developers can use `usePropsFor` to gather all the information required to power this component.
+
+[CaptionsBanner](./?path=/docs/components-captions-banner--docs) is a component that combines and displays captions and Real-Time Text in one banner. Developers can use `usePropsFor` to gather all the information required to power this component, including the list of Real-Time Text and captions messages in the call. Users can also use this component for captions only or Real-Time Text only.
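+
+Below is a rough sketch of how these components might be wired together with `usePropsFor` (this assumes your app is already wrapped in the usual provider setup, such as `FluentThemeProvider`, `CallClientProvider`, `CallAgentProvider`, and `CallProvider`, and that these components are exported and supported by `usePropsFor` in your version of the library; see the individual component pages for complete snippets):
+
+```tsx
+import React from 'react';
+import { CaptionsBanner, StartCaptionsButton, usePropsFor } from '@azure/communication-react';
+
+// Minimal sketch: usePropsFor wires each component to the call state provided by the surrounding providers.
+export const CaptionsControls = (): JSX.Element => {
+  const startCaptionsButtonProps = usePropsFor(StartCaptionsButton);
+  const captionsBannerProps = usePropsFor(CaptionsBanner);
+  return (
+    <>
+      <StartCaptionsButton {...startCaptionsButtonProps} />
+      <CaptionsBanner {...captionsBannerProps} />
+    </>
+  );
+};
+```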
## How to use Captions
@@ -81,7 +89,7 @@ The caption language (Teams Interop Closed Captions) is set to English by defaul
diff --git a/packages/storybook8/stories/Concepts/InCallNotifications/Docs.mdx b/packages/storybook8/stories/Concepts/InCallNotifications/Docs.mdx
new file mode 100644
index 00000000000..c8c7760c086
--- /dev/null
+++ b/packages/storybook8/stories/Concepts/InCallNotifications/Docs.mdx
@@ -0,0 +1,26 @@
+import { Meta } from '@storybook/addon-docs';
+
+
+
+# In Call Notifications
+
+Azure Communication Services UI Library is adding support for improved in-call notifications.
+
+In call notifications are essential for providing users with timely and relevant information about their calling experience.
+Whether it is an error message, a mute status, or a network quality indicator, notifications can help users troubleshoot issues and improve their communication.
+The new feature of ACS UI Library simplifies the display and management of multiple notifications in a consistent and user-friendly way.
+The in-call notification feature introduces a streamlined UI experience for displaying errors and notifications in the calling environment.
+
+## Incorporating In Call Notifications into your Experience
+
+The UI Library enables users to enhance their video conferencing experiences by providing improved notifications by default within the CallComposite and CallWithChat experiences.
+
+There are also two components that have been exposed for this feature:
+
+[Notification](./?path=/docs/components-notification--doc)
+
+Notification is a container that shows a notification in a bar format with an icon, title, message, and button. The icon and title are required, while the message and button are optional.
+
+[NotificationStack](./?path=/docs/components-notificationstack--doc)
+
+NotificationStack is a wrapper on the Notification component with additional features for surfacing Azure Communication Services notifications on the UI consistently.
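+
+As an illustrative sketch only (the `activeNotifications` prop shape and the notification `type` strings below are assumptions made for this example; refer to the component pages above for the exact API):
+
+```tsx
+import React from 'react';
+import { NotificationStack } from '@azure/communication-react';
+
+// Hypothetical usage: surface a fixed set of notifications in a stack.
+// The prop name `activeNotifications` and the `type` values are assumptions for illustration.
+export const CallNotifications = (): JSX.Element => (
+  <NotificationStack
+    activeNotifications={[{ type: 'speakingWhileMuted' }, { type: 'startVideoGeneric' }]}
+  />
+);
+```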
diff --git a/packages/storybook8/stories/Concepts/MediaAccess/Doc.mdx b/packages/storybook8/stories/Concepts/MediaAccess/Doc.mdx
index 601df720fd4..79fb1bef9c1 100644
--- a/packages/storybook8/stories/Concepts/MediaAccess/Doc.mdx
+++ b/packages/storybook8/stories/Concepts/MediaAccess/Doc.mdx
@@ -8,12 +8,17 @@ import MediaAccessRemoteParticipantsText from '!!raw-loader!./snippets/MediaAcce
# Media access
-The media access feature in Teams meetings allows the Organizer, Co-organizer, and Presenter to control whether attendees can enable their mic or camera.
-This can be managed through the Teams meeting options “Allow mic/camera for attendees” or on a per-participant basis with the options “Disable mic/camera” and “Enable mic/camera.”
+## Overview
-Teams meeting attendees can check their own media access state using the capabilities `unMuteMic` and `turnVideoOn`, or view the media states for remote participants.
+Azure Communication Services UI Library is adding support for media access.
+This feature allows organizers, co-organizers, and presenters to control the ability of other attendees to send audio and video.
+Additionally, users can determine if their audio or video is enabled or disabled and check the media access status of other participants.
+It provides greater control over meeting environments, minimizing disruptions from unexpected background noise or disruptive participants.
+Here are some example scenarios where hard mute and video mute are useful:
-ACS users must have the Organizer, Co-organizer, or Presenter role to use the media access feature.
+During a large virtual visit with multiple doctors, the host can use hard mute to ensure that only the speaker is heard, reducing interruptions.
+
+In a virtual banking session, the host can video mute participants to maintain focus on the presentation, ensuring no distractions from video feeds.
The supported scenarios for the media access feature are:
@@ -21,6 +26,15 @@ The supported scenarios for the media access feature are:
- Teams Interop Meetings as a Teams user
- Teams ad-hoc call
+## How to use the Media Access Feature
+
+The media access feature in Teams meetings allows the Organizer, Co-organizer, and Presenter to control whether attendees can enable their mic or camera.
+This can be managed through the Teams meeting options “Allow mic/camera for attendees” or on a per-participant basis with the options “Disable mic/camera” and “Enable mic/camera.”
+
+Teams meeting attendees can check their own media access state using the capabilities `unMuteMic` and `turnVideoOn`, or view the media states for remote participants.
+
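+As a hedged sketch, an attendee's application could read these capabilities from the Calling SDK's Capabilities feature (the capability key names and casing below are assumptions and may differ from the spelling above depending on SDK version):
+
+```typescript
+import { Call, Features } from '@azure/communication-calling';
+
+// Sketch only: log whether this attendee is currently allowed to unmute or turn video on.
+// `call` is assumed to be an established Call; capability key names/casing may vary by SDK version.
+const logMediaAccess = (call: Call): void => {
+  const capabilities = call.feature(Features.Capabilities).capabilities;
+  console.log('Can unmute mic:', capabilities.unmuteMic.isPresent);
+  console.log('Can turn video on:', capabilities.turnVideoOn.isPresent);
+};
+```
+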
+ACS users must have the Organizer, Co-organizer, or Presenter role to use the media access feature.
+
Participants can disable/enable audio/video using the contextual menu button on their video gallery tile as shown below:
+
+## FAQ:
+
+### When recording a call or meeting, will the hard Mute or video mute status be included in the recording?
+
+The recording bot subscribes to the overall audio and video streams, so the muted status will not impact the recording as it is based on the live state during the meeting.
+
+### What happens when a user is hard muted or video-muted?
+
+When a user is hard muted, they receive a notification informing them of their muted status. Similarly, when a user is video muted, they receive a notification. Icons indicating the mute status appear next to their name in the participant list and video gallery tile.
+
+### Can users see who has been muted?
+
+Yes, users can see muted icons next to the names of participants who have been hard muted, or video muted in the participant list and video gallery tile.
diff --git a/packages/storybook8/stories/Concepts/PowerPointLive/Doc.mdx b/packages/storybook8/stories/Concepts/PowerPointLive/Doc.mdx
index 8905cdc9da5..594613bc2f2 100644
--- a/packages/storybook8/stories/Concepts/PowerPointLive/Doc.mdx
+++ b/packages/storybook8/stories/Concepts/PowerPointLive/Doc.mdx
@@ -10,6 +10,12 @@ Azure Communication Services (ACS) integrate PowerPoint Live into Teams meetings
## Overview
+Azure Communication Services UI Library is adding support for receiving PowerPoint Live video stream.
+
+PowerPoint Live in Teams gives both the presenter and audience an inclusive and engaging experience, combining the best parts of presenting in PowerPoint with the connection and collaboration of a Microsoft Teams meeting.
+ACS UI Library is now able to receive such streams, allowing for better presentations and more personalized experiences.
+This feature enables greater productivity and a better experience for ACS-based participants in interop calls.
+
In the ACS UI library, users participate as attendees to view PowerPoint Live sessions. The interaction capabilities for attendees are set and controlled by the lead presenter of the session.
For a detailed breakdown of the capabilities available to each role during a PowerPoint Live session, please refer to the [PowerPoint Live documentation](https://support.microsoft.com/en-us/office/present-from-powerpoint-live-in-microsoft-teams-28b20e74-7165-499c-9bd4-0ad975d448ad). For more technical details on how to implement these features using the web calling SDK, consult the [Calling SDK documentation](https://learn.microsoft.com/en-us/azure/communication-services/concepts/voice-video-calling/calling-sdk-features).
@@ -27,6 +33,16 @@ Some of the best use cases for PowerPoint Live are:
Figure: PowerPoint Live interface comparison. Left: Teams app. Right: ACS UI library.
+## Incorporating PowerPoint Live into your experience:
+
+The UI Library enables users to enhance their video conferencing experiences by providing the ability to receive the PowerPoint Live video stream by default within the CallComposite and CallWithChat experiences.
+
+Important notes:
+
+Currently, this feature only supports receiving a non-interactive PowerPoint Live feed. ACS users will not be able to interact with the video stream in the same manner as a user on the Teams client, nor will ACS users be able to send a PowerPoint Live feed.
+
+This feature is only available for ACS/Teams Interop calls.
+
## Implementing with the VideoGallery
The VideoGallery component is used to display PowerPoint Live sessions. This multi-functional component also supports video streams and screen sharing. Here’s a brief guide to integrating it:
@@ -50,3 +66,15 @@ No, our support is limited to Teams meetings where the presenter can share the P
### If I have found any issues or anything missing with the PowerPoint live in the CallComposite where can I go?
Please log a [Github issue](https://github.com/Azure/communication-ui-library/issues), and someone from our team can work with you.
+
+## Tips for investigating issues:
+
+- For PPTLive initialization: the Web Calling SDK has telemetry covering initialize, success, and failure.
+
+- ACS takes first responsibility for investigating when an issue is raised.
+
+- Use callId to Locate the Call: Begin the debugging process by identifying the call with issues, using the callId. This ID is your entry point for tracing the problem.
+
+- Examine contentId for Specific Content Issues: Once the call is located, look for the contentId. The contentId is a unique identifier for a sharing session within that call, pinpointing the content that is causing problems.
+
+- Check featureDetails_step for Detailed Insights: Investigate the featureDetails_step to uncover detailed information about the steps taken during the PPTLive session. This can include both the operational steps Contoso performed and any error steps that occurred.
diff --git a/packages/storybook8/stories/Concepts/RaiseHands/Doc.mdx b/packages/storybook8/stories/Concepts/RaiseHands/Doc.mdx
new file mode 100644
index 00000000000..b4c72213d4d
--- /dev/null
+++ b/packages/storybook8/stories/Concepts/RaiseHands/Doc.mdx
@@ -0,0 +1,17 @@
+import { Meta } from '@storybook/addon-docs';
+
+
+
+# Raise Hands
+
+Azure Communication Services UI Library is adding support for Raised Hands.
+
+With this new feature, users can now raise and lower their hand during a meeting or call. The user can also see the raised hands of other users to know if someone else is looking to speak. Raised hands are displayed prominently in the video gallery and can also be seen in the participant list, so everyone raising their hand on a call is visible. Enabling this feature promotes greater productivity and allows for a better flow of discussions and conversations within calls. Here are some example scenarios where raised hands are useful:
+
+During a meeting with many participants, the raised hands feature can help the host to manage the conversation and ensure that everyone has a chance to speak. Participants can raise their hands to indicate that they would like to speak, and the host can call on them in order.
+
+In a call setting, the raised hands feature can be used to signal that an action has been completed by the members of a call. For example, participants can raise their hands to signal that they have completed reading a document that has been shared. This lets the host know they can move on with the meeting.
+
+## Incorporating Raise Hands into your Experience
+
+The UI Library enables users to enhance their video conferencing experiences by providing the raised hands feature by default within the CallComposite and CallWithChat experiences.
diff --git a/packages/storybook8/stories/Concepts/RealTimeText/Docs.mdx b/packages/storybook8/stories/Concepts/RealTimeText/Docs.mdx
index 194cbb4e826..d5071c0da74 100644
--- a/packages/storybook8/stories/Concepts/RealTimeText/Docs.mdx
+++ b/packages/storybook8/stories/Concepts/RealTimeText/Docs.mdx
@@ -7,10 +7,25 @@ import { overviewPageImagesStackStyle } from '../../constants';
# Real-Time Text
+## Overview
+
-Azure Communication Services UI Library supports Real-Time Text. Real-Time Text is a powerful tool that enables developers to enhance the accessibility of their videos.
-Real-Time Text (RTT) is a new accessibility feature that enables live transmission of text as it is typed, enhancing communication during voice and video calls. RTT displays text instantly, character by character, creating a natural and dynamic communication experience that mirrors spoken conversation.
+Real-Time Text (RTT) is a feature that allows text to be sent and received instantly during voice and video calls. As you type, the text appears immediately, letter by letter, making the conversation feel more natural and dynamic, just like talking.
+
+For example, if an RTT user types "Hello, how are you?" during a call, the recipient sees each character as it is typed: "H," then "He," then "Hel," and so on. This immediacy allows for fluid, continuous exchanges, benefiting users with hearing or speech impairments and improving clarity in noisy or quiet environments.
+
+## Key Scenarios Where RTT is Useful:
+
+Accessibility: RTT empowers individuals with speech or hearing impairments to participate in conversations actively, ensuring their input is received as naturally as spoken words.
+
+Enhancing Clarity: In noisy environments or situations with audio quality issues, RTT provides a reliable text-based alternative.
+
+## Why RTT Matters
+
+Microsoft is dedicated to accessibility, and the incorporation of Real-Time Text (RTT) supports this commitment by adhering to accessibility standards such as the European Accessibility Act (Directive (EU) 2019/882). This directive requires that voice and video services support RTT by June 2025, ensuring inclusive communication throughout Europe. Voice and video services will not be permitted to operate in Europe or serve Europe-based customers after June 2025 without RTT, making the inclusion of this feature in the UI Library critically important.
+
+## Incorporating RTT in Your Experience
Real-Time Text is supported by default and is automatically included within the CallComposite and CallWithChatComposite experiences.
@@ -60,3 +75,17 @@ Note that Real-Time Text will be enabled for all participants in the call once t
/>
+
+## RTT is available in these scenarios
+
+| Call Type | Supported | Notes |
+| --------------------- | --------- | ---------------------------------------- |
+| Roomes | Yes |
+| 1:1/1:n | Yes |
+| Teams Meeting Interop | No | Supported once RTT is available in Teams |
+| Teams Adhoc calling | No | Supported once RTT is available in Teams |
+| Group Calls | Yes |
+| PSTN | No | Timeline for support TBD |
+
+For more details about the underlying infrastructure, see:
+[Real Time Text (RTT) Overview - An Azure Communication Services concept document | Microsoft Learn](https://learn.microsoft.com/en-us/azure/communication-services/concepts/voice-video-calling/real-time-text)
diff --git a/packages/storybook8/stories/Concepts/Survey/Doc.mdx b/packages/storybook8/stories/Concepts/Survey/Doc.mdx
index 11c1eaee07d..a0416e6322a 100644
--- a/packages/storybook8/stories/Concepts/Survey/Doc.mdx
+++ b/packages/storybook8/stories/Concepts/Survey/Doc.mdx
@@ -19,15 +19,21 @@ Participants will be able to submit feedback based on four categories:
They can rate their call experience on a star based numerical survey and provide additional detail as to the specifics of each category if they wish. The feedback feature will enable developers to collect subjective customer feedback on call quality and reliability and enable the creation of more definite metrics.
-## Handling Survey Results
+With the ACS UI Library adding support for the end of call survey, developers can use a prebuilt UI to survey users at the end of their calls, collect feedback, and improve their audio and video quality.
-Survey feedback is automatically sent to Azure Monitor. To send survey results to your own service, you can also gain access to the survey results by passing in a custom function utilizing the `onSurveySubmitted` prop inside `surveyOptions`.
+## Use Cases
-With `onSurveySubmitted` populated, a free form text survey is available to the call users at end of call to gather more detailed feedback.
+Here are some example scenarios where the end of call survey is useful:
-- Note that results from the free form text survey will not be sent to Azure Monitor. Text results are accessible through the `improvementSuggestions` prop from `onSurveySubmitted` and need to be collected and handled by Contoso.
+Feedback on Technical Issues: An end of call survey gives customers a way to report issues they experienced during the call, such as poor audio or video quality, dropped calls, or lag. This feedback can help companies identify and address technical issues that may be impacting the quality of their video calls.
-
+Overall Feedback: A 5-star ranking system is useful in an end of call survey for video calls because it provides a simple and easy-to-use scale for customers to rate their experience. It allows for more nuanced feedback than a simple yes or no question and can be used to track customer satisfaction over time. It also captures feedback points that would otherwise be lost with a longer survey.
+
+## Incorporating End of Call Survey into your experience:
+
+The UI Library enables users to enhance their video conferencing experiences by providing the end of call survey by default within the CallComposite and CallWithChatComposite experiences.
+
+The end of call survey is enabled by default in the composite, but developers can disable/enable the feature if they wish.
## Disabling End of Call Survey
@@ -35,6 +41,16 @@ The UI Library enables users to display surveys at end of call by providing the
+## Handling Survey Results
+
+Survey feedback is automatically sent to Azure Monitor. To send survey results to your own service, you can also gain access to the survey results by passing in a custom function utilizing the `onSurveySubmitted` prop inside `surveyOptions`.
+
+With `onSurveySubmitted` populated, a free form text survey is available to the call users at end of call to gather more detailed feedback.
+
+- Note that results from the free form text survey will not be sent to Azure Monitor. Text results are accessible through the `improvementSuggestions` prop from `onSurveySubmitted` and need to be collected and handled by Contoso.
+
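+A hedged sketch of passing `surveyOptions` through the composite options (the callback parameter names below and the endpoint URL are assumptions for illustration; check the API reference for the exact `onSurveySubmitted` signature):
+
+```tsx
+import React from 'react';
+import { CallAdapter, CallComposite } from '@azure/communication-react';
+
+// Sketch: forward survey results (including any free form text) to your own service.
+// The callback parameter names are assumptions; the endpoint URL is a placeholder.
+export const CallWithSurvey = (props: { adapter: CallAdapter }): JSX.Element => (
+  <CallComposite
+    adapter={props.adapter}
+    options={{
+      surveyOptions: {
+        onSurveySubmitted: async (callId: unknown, surveyId: unknown, submittedSurvey: unknown, improvementSuggestions: unknown) => {
+          await fetch('https://contoso.example/api/call-survey', {
+            method: 'POST',
+            headers: { 'Content-Type': 'application/json' },
+            body: JSON.stringify({ callId, surveyId, submittedSurvey, improvementSuggestions })
+          });
+        }
+      }
+    }}
+  />
+);
+```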
+
+
## Redirect to your own experience after end of call survey
The UI Library enables users to redirect to their own experience when end of call survey is skipped, submitted, or has an issue sending. This is done by passing in a custom function utilizing the `onSurveyClosed` prop inside `surveyOptions`.
@@ -42,3 +58,9 @@ The UI Library enables users to redirect to their own experience when end of cal
- Note that by writing to this function, the default screens shown after survey is closed are overwritten. Users can choose to redirect to a specific screen after survey is closed, or redirect to different screens based on surveyState
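+
+For example, a minimal sketch (the specific `surveyState` values below are assumptions based on the skipped, submitted, and error cases described above):
+
+```tsx
+const surveyOptions = {
+  // Sketch: route users to your own pages once the survey is closed.
+  // The surveyState string values are assumptions for illustration.
+  onSurveyClosed: (surveyState: string): void => {
+    if (surveyState === 'error') {
+      window.location.assign('/call-feedback-issue');
+    } else {
+      window.location.assign('/thank-you');
+    }
+  }
+};
+```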
+
+## Best Practices:
+
+- To best utilize the survey, we recommend only surveying a subset of users.
+- When the free form text survey is enabled, the free form text data collected is not sent to Azure Monitor. We suggest setting up your own service for handling free form text data.
+- To track your survey results in Azure Monitor, please follow this guidance: [End of Call Survey Logs](https://learn.microsoft.com/en-us/azure/communication-services/concepts/analytics/logs/end-of-call-survey-logs).
diff --git a/packages/storybook8/stories/Concepts/TogetherMode/Docs.mdx b/packages/storybook8/stories/Concepts/TogetherMode/Docs.mdx
index d86dd9d2d8e..a0d1280c12b 100644
--- a/packages/storybook8/stories/Concepts/TogetherMode/Docs.mdx
+++ b/packages/storybook8/stories/Concepts/TogetherMode/Docs.mdx
@@ -6,18 +6,24 @@ import TogetherModeSnippetText from '!!raw-loader!./snippets/TogetherMode.snippe
# TogetherMode
-Together Mode in Teams uses AI to place everyone in a shared virtual background, creating the feeling of being in the same room.
-It’s built to make meetings more engaging and foster stronger connections.
+Azure Communication Services UI Library is adding support for Together Mode in Interop only scenarios.
-It’s especially effective for training sessions, workshops, or team-building activities.
-A classroom setting, like the one shown below, promotes learning and collaboration, helping participants stay engaged and comfortable in a familiar atmosphere.
+With the introduction of this new feature, ACS users can now participate in together mode during interop meetings and calls.
+This feature allows users to see all participants in a shared virtual background, fostering a sense of being in the same room.
+Together mode is prominently displayed in the video gallery, enabling users to keep track of everyone engaged in the call.
+Enabling together mode enhances productivity and promotes a smoother flow of discussions and conversations within meetings.
+Here are some example scenarios where together mode is useful:
+
+- During a meeting with many participants, Together Mode can help the host to manage the conversation and ensure that everyone feels included. Participants can feel a greater sense of connection and collaboration as if they are physically present in the same room, even though they are remote.
+
+- In a call setting, Together Mode can be used to create a more engaging and interactive experience. For example, participants can see each other in a shared virtual environment, which can enhance communication. When a user begins to share their screen while Together Mode is active, the view switches back to the default view until screen sharing stops, and then returns to Together Mode. This helps the host maintain engagement and keep the meeting moving efficiently.
## Supported Scenarios
-Together Mode is available only in Teams meetings and group calls. It can be started exclusively by Teams users with the roles of Organizer, Co-organizer, or Presenter.
-ACS (Azure Communication Services) users cannot initiate Together Mode themselves, but they can view the Together Mode stream if it’s started by a Teams user.
+Together mode is supported exclusively in Teams interop meetings or group calls that include a Microsoft 365 user.
+It can only be initiated by a Microsoft 365 user who holds the role of organizer, co-organizer, or presenter. Users who do not fulfill this requirement can switch to the Together mode stream view only when it has been initiated by an eligible user.
The option to start or switch to Together Mode is disabled because ACS users do not have permission to initiate Together
@@ -27,6 +33,11 @@ Mode.
When a Teams user starts Together Mode, the view option becomes enabled, allowing ACS users to switch to the Together
Mode view.
+## Incorporating Together Mode into your experience:
+
+The UI library includes together mode as a feature by default within the CallComposite and CallWithChat Composite experiences, so no additional work is required.
+However, the Together mode view is only available for ACS users when on an interop call with a Teams or CTE user who has started together mode.
+
## Implementing with the VideoGallery
The VideoGallery component is used to display together mode stream. This multi-functional component also supports video streams and screen sharing.
@@ -51,6 +62,18 @@ For comprehensive setup instructions, refer to the [VideoGallery](./?path=/docs/
## FAQs
+### When recording a call or meeting, will the Together Mode view be included in the recording?
+
+The recording bot does not subscribe to the Together Mode stream. Therefore, when recording is initiated during a meeting or call, the video stream from Together Mode will not be included in the recording.
+
+### Does Together Mode handle signaling features like spotlight or raise hand?
+
+Yes. When Together Mode is active, spotlight and raise hand are handled by the Together Mode feature.
+
+### Is Panorama view supported?
+
+No. In this version, Panorama View is not supported in the UI.
+
### Can I use UI library to change TogetherMode Scene?
No, scene change can only be performed on Teams Desktop client.
From 4bd625661af436b63f8b8c2d569fc4431186c023 Mon Sep 17 00:00:00 2001
From: carocao-msft <96077406+carocao-msft@users.noreply.github.com>
Date: Tue, 4 Feb 2025 11:03:28 -0800
Subject: [PATCH 2/5] Update terminology from "hard mute" to "disable"
---
.../stories/Concepts/MediaAccess/Doc.mdx | 16 ++++++++--------
1 file changed, 8 insertions(+), 8 deletions(-)
diff --git a/packages/storybook8/stories/Concepts/MediaAccess/Doc.mdx b/packages/storybook8/stories/Concepts/MediaAccess/Doc.mdx
index 79fb1bef9c1..c2940c33d47 100644
--- a/packages/storybook8/stories/Concepts/MediaAccess/Doc.mdx
+++ b/packages/storybook8/stories/Concepts/MediaAccess/Doc.mdx
@@ -14,11 +14,11 @@ Azure Communication Services UI Library is adding support for media access.
This feature allows organizers, co-organizers, and presenters to control the ability of other attendees to send audio and video.
Additionally, users can determine if their audio or video is enabled or disabled and check the media access status of other participants.
This feature allows greater control over meeting environments, minimizing disruptions from unexpected background noise or a disruptive participant.
-Here are some example scenarios where hard mute and video mute are useful:
+Here are some example scenarios where Disable mic and Disable camera are useful:
-During a large virtual visit with multiple doctors, the host can use hard mute to ensure that only the speaker is heard, reducing interruptions.
+During a large virtual visit with multiple doctors, the host can use the Disable mic option to ensure that only the speaker is heard, reducing interruptions.
-In a virtual banking session, the host can video mute participants to maintain focus on the presentation, ensuring no distractions from video feeds.
+In a virtual banking session, the host can disable participants' cameras to maintain focus on the presentation, ensuring no distractions from video feeds.
The supported scenarios for the media access feature are:
@@ -61,7 +61,7 @@ The concept of the media access feature is the same in Microsoft Teams which you
more about here -
[Manage attendee audio and video permissions in Microsoft Teams meetings](https://support.microsoft.com/en-us/office/manage-attendee-audio-and-video-permissions-in-microsoft-teams-meetings-f9db15e1-f46f-46da-95c6-34f9f39e671a).
-## Incorporating Hard Mute and Video Mute into your experience:
+## Incorporating Disable mic and Disable camera into your experience:
The UI library includes the media access feature by default within the CallComposite and CallWithChat Composite experiences, so no additional work is required.
@@ -96,14 +96,14 @@ for remote participants from an added dropdown that is populated by remote parti
## FAQ:
-### When recording a call or meeting, will the hard Mute or video mute status be included in the recording?
+### When recording a call or meeting, will the Disable mic or Disable camera status be included in the recording?
The recording bot subscribes to the overall audio and video streams, so the muted status will not impact the recording as it is based on the live state during the meeting.
-### What happens when a user is hard muted or video-muted?
+### What happens when a user's mic or video is disabled?
-When a user is hard muted, they receive a notification informing them of their muted status. Similarly, when a user is video muted, they receive a notification. Icons indicating the mute status appear next to their name in the participant list and video gallery tile.
+When a user's mic is disabled, they receive a notification informing them of their muted status. Similarly, when a user's camera is disabled, they receive a notification. Icons indicating the mute status appear next to their name in the participant list and video gallery tile.
### Can users see who has been muted?
-Yes, users can see muted icons next to the names of participants who have been hard muted, or video muted in the participant list and video gallery tile.
+Yes, users can see muted icons next to the names of participants whose mic or camera has been disabled in the participant list and video gallery tile.
From bede372bf035031a52726e2fe73be2ebd9bd2c47 Mon Sep 17 00:00:00 2001
From: carocao-msft <96077406+carocao-msft@users.noreply.github.com>
Date: Thu, 6 Feb 2025 11:28:04 -0800
Subject: [PATCH 3/5] Update ACS to Azure Communication Services
---
packages/storybook8/stories/Concepts/ClosedCaptions/Doc.mdx | 6 +++---
1 file changed, 3 insertions(+), 3 deletions(-)
diff --git a/packages/storybook8/stories/Concepts/ClosedCaptions/Doc.mdx b/packages/storybook8/stories/Concepts/ClosedCaptions/Doc.mdx
index 3dde5159c3c..f0128cd72e0 100644
--- a/packages/storybook8/stories/Concepts/ClosedCaptions/Doc.mdx
+++ b/packages/storybook8/stories/Concepts/ClosedCaptions/Doc.mdx
@@ -29,11 +29,11 @@ For interop captions, users can enable captions in the menu and select the spoke
Captions do not detect language automatically, so the spoken language selected needs to match the language that will be used in the call.
Translation is also supported in this scenario.
-## ACS Based Captions:
+## Azure Communication Services Based Captions:
-ACS Closed Captions are enabled by default and are automatically included within the CallComposite and CallWithChatComposite experiences.
+Azure Communication Services Closed Captions are enabled by default and are automatically included within the CallComposite and CallWithChatComposite experiences.
Captions can be enabled both in Mobile Web sessions and in Desktop Web sessions.
-Currently, ACS captions does not support changing caption language or translation. Only changing the spoken language is supported at this time.
+Currently, Azure Communication Services captions do not support changing the caption language or translation. Only changing the spoken language is supported at this time.
## Closed Captions using Components
From 33f45fda4ea41325beb1b6166d4d6949ef52b367 Mon Sep 17 00:00:00 2001
From: carocao-msft <96077406+carocao-msft@users.noreply.github.com>
Date: Fri, 7 Feb 2025 10:35:01 -0800
Subject: [PATCH 4/5] Fix typo in 'Rooms' table entry
---
packages/storybook8/stories/Concepts/RealTimeText/Docs.mdx | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/packages/storybook8/stories/Concepts/RealTimeText/Docs.mdx b/packages/storybook8/stories/Concepts/RealTimeText/Docs.mdx
index d5071c0da74..1abb52bb4d5 100644
--- a/packages/storybook8/stories/Concepts/RealTimeText/Docs.mdx
+++ b/packages/storybook8/stories/Concepts/RealTimeText/Docs.mdx
@@ -80,7 +80,7 @@ Note that Real-Time Text will be enabled for all participants in the call once t
| Call Type | Supported | Notes |
| --------------------- | --------- | ---------------------------------------- |
-| Roomes | Yes |
+| Rooms | Yes |
| 1:1/1:n | Yes |
| Teams Meeting Interop | No | Supported once RTT is available in Teams |
| Teams Adhoc calling | No | Supported once RTT is available in Teams |
From 03a222ba453f9d9736ed5c3b3cf816b39a458e65 Mon Sep 17 00:00:00 2001
From: carocao-msft <96077406+carocao-msft@users.noreply.github.com>
Date: Fri, 7 Feb 2025 18:59:20 +0000
Subject: [PATCH 5/5] fix lint
---
packages/storybook8/stories/Concepts/RealTimeText/Docs.mdx | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/packages/storybook8/stories/Concepts/RealTimeText/Docs.mdx b/packages/storybook8/stories/Concepts/RealTimeText/Docs.mdx
index 1abb52bb4d5..b5f413ccb36 100644
--- a/packages/storybook8/stories/Concepts/RealTimeText/Docs.mdx
+++ b/packages/storybook8/stories/Concepts/RealTimeText/Docs.mdx
@@ -80,7 +80,7 @@ Note that Real-Time Text will be enabled for all participants in the call once t
| Call Type | Supported | Notes |
| --------------------- | --------- | ---------------------------------------- |
-| Rooms | Yes |
+| Rooms | Yes |
| 1:1/1:n | Yes |
| Teams Meeting Interop | No | Supported once RTT is available in Teams |
| Teams Adhoc calling | No | Supported once RTT is available in Teams |