Update storybook concept pages to match their corresponding conceptual docs #5613

Merged · 6 commits · Feb 7, 2025
Changes from 1 commit
36 changes: 22 additions & 14 deletions packages/storybook8/stories/Concepts/ClosedCaptions/Doc.mdx
@@ -11,31 +11,39 @@ import {

# Closed Captions

Azure Communication Services UI Library is adding support for Closed Captions. Closed captions are a powerful tool that enables developers to enhance the accessibility of their videos. With closed captions, developers can provide a textual representation of the audio content in their videos, making it easier for users who are deaf or hard of hearing to follow along.
Azure Communication Services UI Library is adding support for Closed Captions. Closed captions are a powerful tool that enables developers to enhance the accessibility of their videos. With closed captions, developers can provide a textual representation of the audio content in their videos, making it easier for users who are deaf or hard of hearing to follow along. Here are the main scenarios where Closed Captions are useful:

Closed Captions are supported by default and are automatically included within the CallComposite and CallWithChatComposite experiences.
Accessibility: In the workplace or consumer apps, Closed Captioning for meetings, conference calls, and training videos can make a dramatic difference. It also helps in scenarios where audio cannot be heard, either because of a noisy environment, such as an airport, or because of an environment that must be kept quiet, such as a hospital.

Closed Captions are also supported for component users.
Inclusivity: Closed Captioning was developed to aid hearing-impaired people but can also be useful for language proficiency.

List of components exported for Closed Captions:
## Incorporating Closed Captions into your experience

[StartCaptionsButton](./?path=/docs/components-start-captions-button--docs) is a component that can be used to turn on captions. Developers can use `usePropsFor` to gather all the information required to power this component.
The UI Library helps Contoso become more accessible and inclusive by providing closed captions within the composite and component experiences.

[CaptionsSettingsModal](./?path=/docs/components-captions-settings-modal--docs) is a modal pop-up that allows users to select the spoken/caption language. Developers can use `usePropsFor` to gather all the information required to power this component.
## Interop Closed Captions:

[CaptionsBanner](./?path=/docs/components-captions-banner--docs) is a component that combines and displays captions and Real-Time Text in one banner. Developers can use `usePropsFor` to gather all the information required to power this component, including the list of Real-Time Text and captions messages in the call. Users can also use this component for captions only or Real-Time Text only.
Interop Closed Captions are enabled by default and are automatically included within the CallComposite and CallWithChatComposite experiences.
Captions can be enabled both in Mobile Web sessions and in Desktop Web sessions.
For interop captions, users can enable captions in the menu and select the spoken language for the captions.
Captions do not detect language automatically, so the spoken language selected needs to match the language that will be used in the call.
Translation is also supported in this scenario.
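
As an illustrative sketch only (the method names are assumed from the CallAdapter captions surface and may vary between library versions), captions can also be started and configured programmatically through the adapter rather than through the built-in menu:

```ts
import type { CallAdapter } from '@azure/communication-react';

// Hypothetical helper: starts captions for the current call and sets the
// spoken language so it matches what participants will actually speak.
// startCaptions/setSpokenLanguage are assumed adapter methods; verify the
// exact names against the version of the UI Library you are using.
export const enableInteropCaptions = async (adapter: CallAdapter): Promise<void> => {
  await adapter.startCaptions({ spokenLanguage: 'en-us' });

  // Later, e.g. when the user picks a different language in a settings UI:
  await adapter.setSpokenLanguage('fr-fr');
};
```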

## Azure Communication Service Based Captions
## ACS Based Captions:
**Member:** This should be Azure Communication Service still. We try to avoid using the acronym in the documentation.

**Contributor Author:** Updated


Azure Communication Service Closed Captions are supported by default and are automatically included within the CallComposite and CallWithChatComposite experiences for calling scenarios involving Azure Communication Service users only. Captions can be enabled in both Mobile Web sessions and in Desktop Web sessions.
ACS Closed Captions are enabled by default and are automatically included within the CallComposite and CallWithChatComposite experiences.
Captions can be enabled both in Mobile Web sessions and in Desktop Web sessions.
Currently, ACS captions do not support changing the caption language or translation; only changing the spoken language is supported.

For Azure Communication Service captions, users can enable captions in the menu and select the spoken language for the captions. Captions do not detect language automatically, so the spoken language selected needs to match the language that will be used in the call. Currently, Azure Communication Service captions do not support translation.
## Closed Captions using Components

## Teams Interop Closed Captions
For component users, we export the following components to help you integrate captions in your own calling experiences.

Teams Interop Closed Captions are supported by default and are automatically included within the CallComposite and CallWithChatComposite experiences during a call that includes one or more Teams users.
[StartCaptionsButton](./?path=/docs/components-start-captions-button--docs) is a component that can be used to turn on captions. Developers can use `usePropsFor` to gather all the information required to power this component.

The main difference between Azure Communication Service Closed Captions and Teams Interop Closed Captions is that Teams Interop Closed Captions supports translation. End users can choose to have captions translated to a different language by using captions settings.
[CaptionsSettingsModal](./?path=/docs/components-captions-settings-modal--docs) is a modal pop-up that allows users to select the spoken/caption language. Developers can use `usePropsFor` to gather all the information required to power this component.

[CaptionsBanner](./?path=/docs/components-captions-banner--docs) is a component that combines and displays captions and Real-Time Text in one banner. Developers can use `usePropsFor` to gather all the information required to power this component, including the list of Real-Time Text and captions messages in the call. Users can also use this component for captions only or Real-Time Text only.
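
As a rough sketch of how these components fit together, assuming they are rendered inside the UI Library's call providers so that `usePropsFor` can resolve call state (exact prop shapes may differ between library versions):

```tsx
import React from 'react';
import {
  usePropsFor,
  StartCaptionsButton,
  CaptionsSettingsModal,
  CaptionsBanner
} from '@azure/communication-react';

// Renders a start-captions control, the spoken/caption language settings modal,
// and the captions banner. Each component is powered by usePropsFor, which
// pulls the required captions state and handlers from the stateful call client.
export const CaptionsExperience = (): JSX.Element => {
  const startCaptionsButtonProps = usePropsFor(StartCaptionsButton);
  const captionsSettingsModalProps = usePropsFor(CaptionsSettingsModal);
  const captionsBannerProps = usePropsFor(CaptionsBanner);

  return (
    <>
      <StartCaptionsButton {...startCaptionsButtonProps} />
      <CaptionsSettingsModal {...captionsSettingsModalProps} />
      <CaptionsBanner {...captionsBannerProps} />
    </>
  );
};
```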

## How to use Captions

@@ -81,7 +89,7 @@ The caption language (Teams Interop Closed Captions) is set to English by defaul
<PrimaryButton
style={{ width: 'fit-content', color: 'white' }}
text="Go to CallComposite to see captions in action"
href="../?path=/story/composites-callcomposite--basic-example"
href="../?path=/story/composites-callcomposite-basic-example--basic-example"
/>
</Stack>

@@ -0,0 +1,26 @@
import { Meta } from '@storybook/addon-docs';

<Meta title="Concepts/In Call Notifications" />

# In Call Notifications

Azure Communication Services UI Library is adding support for improved notifications.

In call notifications are essential for providing users with timely and relevant information about their calling experience.
Whether it is an error message, a mute status, or a network quality indicator, notifications can help users troubleshoot issues and improve their communication.
This new feature of the ACS UI Library simplifies the display and management of multiple notifications in a consistent and user-friendly way.
The in-call notification feature introduces a streamlined UI experience for displaying errors and notifications in the calling environment.

## Incorporating In Call Notifications into your Experience

The UI Library enables users to enhance their video conferencing experiences by providing improved notifications by default within the CallComposite and CallWithChat experiences.

There are also two Components that have been exposed for this feature:

[Notification](./?path=/docs/components-notification--doc)

Notification is a container that shows a notification in a bar format with an icon, title, message, and button. The icon and title are required, while the message and button are optional. See the example below:

[NotificationStack](./?path=/docs/components-notificationstack--doc)

NotificationStack is a wrapper around the Notification component with additional features for surfacing Azure Communication Services notifications on the UI consistently.
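
As an illustrative sketch (assuming `usePropsFor` supports NotificationStack in your version of the library, as it does for the other call components):

```tsx
import React from 'react';
import { usePropsFor, NotificationStack } from '@azure/communication-react';

// Surfaces call notifications (errors, mute state, network quality, etc.) as a
// consistent stack. Must be rendered inside the UI Library's call providers so
// usePropsFor can read the active notifications from the stateful client.
export const CallNotifications = (): JSX.Element => {
  const notificationStackProps = usePropsFor(NotificationStack);
  return <NotificationStack {...notificationStackProps} />;
};
```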
40 changes: 36 additions & 4 deletions packages/storybook8/stories/Concepts/MediaAccess/Doc.mdx
@@ -8,19 +8,33 @@ import MediaAccessRemoteParticipantsText from '!!raw-loader!./snippets/MediaAcce

# Media access

The media access feature in Teams meetings allows the Organizer, Co-organizer, and Presenter to control whether attendees can enable their mic or camera.
This can be managed through the Teams meeting options “Allow mic/camera for attendees” or on a per-participant basis with the options “Disable mic/camera” and “Enable mic/camera.”
## Overview

Teams meeting attendees can check their own media access state using the capabilities `unMuteMic` and `turnVideoOn`, or view the media states for remote participants.
Azure Communication Services UI Library is adding support for media access.
This feature allows organizers, co-organizers, and presenters to control the ability of other attendees to send audio and video.
Additionally, users can determine if their audio or video is enabled or disabled and check the media access status of other participants.
This feature allows greater control over meeting environments, minimizing disruptions from unexpected background noise or a disruptive participant.
Here are some example scenarios where hard mute and video mute are useful:

ACS users must have the Organizer, Co-organizer, or Presenter role to use the media access feature.
During a large virtual visit with multiple doctors, the host can use hard mute to ensure that only the speaker is heard, reducing interruptions.
**Member:** I heard from PM that hard mute is an internal word. From the UI menu we have "Disable mic".

**Contributor Author:** Updated


In a virtual banking session, the host can video mute participants to maintain focus on the presentation, ensuring no distractions from video feeds.
**Member:** Video mute? From the UI menu we have "Disable camera".

**Contributor Author:** Updated


The supported scenarios for the media access feature are:

- Teams Interop Meetings
- Teams Interop Meetings as a Teams user
- Teams ad-hoc call

## How to use Media Access Feature

The media access feature in Teams meetings allows the Organizer, Co-organizer, and Presenter to control whether attendees can enable their mic or camera.
This can be managed through the Teams meeting options “Allow mic/camera for attendees” or on a per-participant basis with the options “Disable mic/camera” and “Enable mic/camera.”

Teams meeting attendees can check their own media access state using the capabilities `unMuteMic` and `turnVideoOn`, or view the media states for remote participants.

ACS users must have the Organizer, Co-organizer, or Presenter role to use the media access feature.

Participants can disable/enable audio/video using the contextual menu button on their video gallery tile, as shown below:

<img
@@ -47,6 +61,10 @@ The concept of the media access feature is the same in Microsoft Teams which you
more about here -
[Manage attendee audio and video permissions in Microsoft Teams meetings](https://support.microsoft.com/en-us/office/manage-attendee-audio-and-video-permissions-in-microsoft-teams-meetings-f9db15e1-f46f-46da-95c6-34f9f39e671a).

## Incorporating Hard Mute and Video Mute into your experience:

The UI Library includes the media access feature by default within the CallComposite and CallWithChatComposite experiences, so no additional work is required.

## Listening to local participant `unmuteMic` and `turnVideoOn` capabilities changes

You can listen to `capabilitiesChanged` events on the CallAdapter or CallWithChatAdapter by defining your own
@@ -75,3 +93,17 @@ The example below shows a code snippet where a button is added to invoke the `fo
for remote participants from an added dropdown that is populated by remote participants in the call.

<Source code={CustomMediaAccessCompositeText} />
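
For reference, a minimal sketch of the `capabilitiesChanged` subscription described above; the state path used to read the capabilities is an assumption based on the adapter's public state and may differ between versions:

```ts
import type { CallAdapter } from '@azure/communication-react';

// Logs the local participant's unmuteMic / turnVideoOn capabilities whenever
// they change (for example, after being disabled by an organizer or presenter).
// The 'capabilitiesChanged' event is part of the CallAdapter surface; the
// getState() path below is assumed and should be checked against your version.
export const subscribeToCapabilityChanges = (adapter: CallAdapter): void => {
  adapter.on('capabilitiesChanged', () => {
    const capabilities = adapter.getState().call?.capabilitiesFeature?.capabilities;
    console.log('unmuteMic allowed:', capabilities?.unmuteMic.isPresent);
    console.log('turnVideoOn allowed:', capabilities?.turnVideoOn.isPresent);
  });
};
```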

## FAQ:

### When recording a call or meeting, will the hard mute or video mute status be included in the recording?

The recording bot subscribes to the overall audio and video streams, so the muted status will not impact the recording as it is based on the live state during the meeting.

### What happens when a user is hard muted or video-muted?

When a user is hard muted, they receive a notification informing them of their muted status. Similarly, when a user is video muted, they receive a notification. Icons indicating the mute status appear next to their name in the participant list and video gallery tile.

### Can users see who has been muted?

Yes, users can see muted icons next to the names of participants who have been hard muted or video muted in the participant list and video gallery tile.
28 changes: 28 additions & 0 deletions packages/storybook8/stories/Concepts/PowerPointLive/Doc.mdx
@@ -10,6 +10,12 @@ Azure Communication Services (ACS) integrate PowerPoint Live into Teams meetings

## Overview

Azure Communication Services UI Library is adding support for receiving the PowerPoint Live video stream.

PowerPoint Live in Teams gives both the presenter and audience an inclusive and engaging experience, combining the best parts of presenting in PowerPoint with the connection and collaboration of a Microsoft Teams meeting.
ACS UI Library is now able to receive such streams, allowing for better presentations and more personalized experiences.
This feature enables greater productivity and allows for a better experience for ACS-based participants in interop calls.

In the ACS UI library, users participate as attendees to view PowerPoint Live sessions. The interaction capabilities for attendees are set and controlled by the lead presenter of the session.

For a detailed breakdown of the capabilities available to each role during a PowerPoint Live session, please refer to the [PowerPoint Live documentation](https://support.microsoft.com/en-us/office/present-from-powerpoint-live-in-microsoft-teams-28b20e74-7165-499c-9bd4-0ad975d448ad). For more technical details on how to implement these features using the web calling SDK, consult the [Calling SDK documentation](https://learn.microsoft.com/en-us/azure/communication-services/concepts/voice-video-calling/calling-sdk-features).
@@ -27,6 +33,16 @@ Some of the best use cases for PowerPoint Live are:
<sup>Figure: PowerPoint Live interface comparison. Left: Teams app. Right: ACS UI library.</sup>
</sub>

## Incorporating PowerPoint Live into your experience:

The UI Library enables users to enhance their video conferencing experiences by providing the ability to receive the PowerPoint Live video stream by default within the CallComposite and CallWithChat experiences.

Important notes:

Currently, this feature only supports receiving a non-interactive PowerPoint Live feed. ACS users will not be able to interact with the video stream in the same manner as a user on the Teams client, nor will ACS users be able to send a PowerPoint Live feed.

This feature is only available for ACS/Teams Interop calls.

## Implementing with the VideoGallery

The VideoGallery component is used to display PowerPoint Live sessions. This multi-functional component also supports video streams and screen sharing. Here’s a brief guide to integrating it:
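
As a rough illustration only (not the official guide), the gallery can be wired with `usePropsFor`; an incoming PowerPoint Live session is then surfaced to the component much like a screen-sharing stream:

```tsx
import React from 'react';
import { usePropsFor, VideoGallery } from '@azure/communication-react';

// Renders remote participants, screen shares, and (in Teams interop calls)
// incoming PowerPoint Live sessions. Must be mounted inside the UI Library's
// call providers so usePropsFor can resolve the gallery's participant state.
export const CallGallery = (): JSX.Element => {
  const videoGalleryProps = usePropsFor(VideoGallery);
  return <VideoGallery {...videoGalleryProps} />;
};
```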
@@ -50,3 +66,15 @@ No, our support is limited to Teams meetings where the presenter can share the P
### If I have found any issues or anything missing with PowerPoint Live in the CallComposite, where can I go?

Please log a [Github issue](https://github.com/Azure/communication-ui-library/issues), and someone from our team can work with you.

## Tips to investigate cases

- For PPTLive initialization: the Web Calling SDK has telemetry covering initialize, success, and failure.

- ACS takes first responsibility when an issue is raised.

- Use callId to Locate the Call: Begin the debugging process by identifying the call with issues, using the callId. This ID is your entry point for tracing the problem.

- Examine contentId for Specific Content Issues: Once the call is located, look for the contentId. The contentId is a unique identifier for a sharing session within that call, pinpointing the content that is causing problems.

- Check featureDetails_step for Detailed Insights: Investigate the featureDetails_step to uncover detailed information about the steps taken during the PPTLive session. This can include both the operational steps Contoso performed and any error steps that occurred.
17 changes: 17 additions & 0 deletions packages/storybook8/stories/Concepts/RaiseHands/Doc.mdx
@@ -0,0 +1,17 @@
import { Meta } from '@storybook/addon-docs';

<Meta title="Concepts/Raise Hands" />

# Raise Hands

Azure Communication Services UI Library is adding support for Raised Hands.

With this new feature, users can now raise and lower their hand during a meeting or call. Users can also see the raised hands of other participants to know if someone else is looking to speak. Raised hands are displayed prominently in the video gallery and can also be seen in the participant list, making it easy to keep track of everyone raising their hand on a call. This feature enables greater productivity and allows for a better flow of discussions and conversations within calls. Here are some example scenarios where raised hands are useful:

During a meeting with many participants, the raised hands feature can help the host to manage the conversation and ensure that everyone has a chance to speak. Participants can raise their hands to indicate that they would like to speak, and the host can call on them in order.

In a call setting, the raised hands feature can be used to signal that an action has been completed by the members of a call. For example, participants can raise their hands to signal that they have completed reading a document that has been shared. This lets the host know they can move on with the meeting.

## Incorporating Raise Hands into your Experience

The UI Library enables users to enhance their video conferencing experiences by providing the raised hands feature by default within the CallComposite and CallWithChat experiences.
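
For reference, a minimal CallComposite setup is enough to get raise hands, since the control appears in the default control bar. The identifiers below are placeholders for illustration only:

```tsx
import React, { useMemo } from 'react';
import { AzureCommunicationTokenCredential } from '@azure/communication-common';
import { CallComposite, useAzureCommunicationCallAdapter } from '@azure/communication-react';

// Minimal composite host: raise hand is included by default, so no feature-
// specific configuration is needed. Replace the placeholder user id, token,
// and group id with real values from your Azure Communication Services resource.
export const RaiseHandsCallApp = (): JSX.Element => {
  const credential = useMemo(() => new AzureCommunicationTokenCredential('<ACCESS_TOKEN>'), []);
  const adapterArgs = useMemo(
    () => ({
      userId: { communicationUserId: '<AZURE_COMMUNICATION_USER_ID>' },
      displayName: 'Raise hands demo user',
      credential,
      locator: { groupId: '<GROUP_CALL_ID>' }
    }),
    [credential]
  );
  const adapter = useAzureCommunicationCallAdapter(adapterArgs);

  return adapter ? <CallComposite adapter={adapter} /> : <>Initializing call adapter…</>;
};
```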