From f2f0c4a8bd40cc4368df88cae66735815a475db4 Mon Sep 17 00:00:00 2001
From: Michael Horton <47794384+michaelhortongsa@users.noreply.github.com>
Date: Fri, 24 Jan 2025 13:50:03 -0500
Subject: [PATCH] Update links to assets (#991)

---
 _data/gwaa_faq.yaml                                  |  6 +++---
 .../2023-03-28-manage-gwaa-criteria-01.md            |  2 +-
 .../2023-report/2023-appx-a-terms.md                 |  8 ++++----
 .../2023-report/2023-appx-c-methods.md               | 12 ++++++------
 .../2023-report/2023-appx-d-overview.md              |  4 ++--
 .../2023-report/2023-findings-cat-high.md            |  2 +-
 .../2023-report/2023-findings-cat-low.md             |  2 +-
 .../2023-report/2023-findings-cat-moderate.md        |  2 +-
 .../2023-report/2023-findings-dimension.md           |  2 +-
 .../2023-report/2023-findings-key-compliance.md      |  4 ++--
 .../2023-report/2023-findings-program-staff.md       |  2 +-
 .../2023-report/2023-findings-testing-lifecycle.md   |  2 +-
 .../2024-report/2024-appx-b-methods.md               |  2 +-
 .../2024-report/2024-introduction.md                 |  2 +-
 14 files changed, 26 insertions(+), 26 deletions(-)

diff --git a/_data/gwaa_faq.yaml b/_data/gwaa_faq.yaml
index bc0eb781b..2786b0ce4 100644
--- a/_data/gwaa_faq.yaml
+++ b/_data/gwaa_faq.yaml
@@ -59,7 +59,7 @@
  • Five questions have been removed altogether: one from Training, one from Content Creation, one from Policies, Procedures, and Practices, a question in Conformance Metrics regarding a public feedback mechanism, and the criteria regarding defects for enterprise-wide printers because of duplicative data, incorrectly scoped questions, or poor data.
-    For a complete list of changes, please reference this crosswalk between FY23 criteria and FY24 criteria.
+    For a complete list of changes, please reference this crosswalk between FY23 criteria and FY24 criteria (XLSX).
 - id: 104
   h: Overview
@@ -153,7 +153,7 @@
   h: Submitting Data
   q: Is there a way for me to gather all my answers offline?
   a: |
-    While reporting entities may use the following Assessment spreadsheet (XLSX) to help collect the information, responses must be submitted using the reporting tool. You may also simply print the criteria questions directly from Assessment criteria pages.
+    While reporting entities may use the following Assessment spreadsheet (XLSX) to help collect the information, responses must be submitted using the reporting tool. You may also simply print the criteria questions directly from Assessment criteria pages.
 - id: 202
   h: Submitting Data
@@ -191,7 +191,7 @@
   h: Submitting Data
   q: What is the best way to navigate the reporting tool?
   a: |
-    We recommend gathering all responses prior to accessing and entering data into the reporting tool. This will ease your reporting entity’s ability to start at the beginning of the reporting tool and answer questions in order. We encourage reporting entities to use the Assessment spreadsheet (XLSX) to gather responses.
+    We recommend gathering all responses prior to accessing and entering data into the reporting tool. This will ease your reporting entity’s ability to start at the beginning of the reporting tool and answer questions in order. We encourage reporting entities to use the Assessment spreadsheet (XLSX) to gather responses.

     Within this submission response tool, your data is saved automatically—provided you use the same device and same browser—and you may return to the submission tool to continue entering information as many times as you need. You may navigate to different sections of the Assessment Criteria using the Table of Contents, Previous button, or Next button. However, once you enter a section, you must answer all required criteria on the screen before moving to another section using the Table of Contents. You can use the Previous button to navigate backwards without answering all required questions on screen, but you cannot use the "Next" button within a section to skip questions without inputting an answer.
 -
diff --git a/_pages/manage/annual-assessment/2023-03-28-manage-gwaa-criteria-01.md b/_pages/manage/annual-assessment/2023-03-28-manage-gwaa-criteria-01.md
index eb6c02a13..8cd0fc0f1 100644
--- a/_pages/manage/annual-assessment/2023-03-28-manage-gwaa-criteria-01.md
+++ b/_pages/manage/annual-assessment/2023-03-28-manage-gwaa-criteria-01.md
@@ -21,7 +21,7 @@ Below is a list of the criteria reporting entities must use to evaluate the exte
 These questions are broken down into eleven different criteria; one focuses on general information, nine focus on maturity, and one focuses on Section 508 conformance metrics. This is the final list of the questions that your reporting entity will need to submit responses on by July 31, 2024.
-Reporting entity POCs will receive a link via email to the reporting tool no later than June 1st to submit the information. Feel free to use the following spreadsheet (XLSX) to help you collect the information; however, responses must be submitted using the reporting tool. You may also simply print the criteria questions directly from this page as a guide.
+Reporting entity POCs will receive a link via email to the reporting tool no later than June 1st to submit the information. Feel free to use the following spreadsheet (XLSX) to help you collect the information; however, responses must be submitted using the reporting tool. You may also simply print the criteria questions directly from this page as a guide.

    General Information

    Questions in this section ask about information and metrics related to the activities of your reporting entity's Section 508 Program (or equivalent).

diff --git a/_pages/manage/annual-assessment/2023-report/2023-appx-a-terms.md b/_pages/manage/annual-assessment/2023-report/2023-appx-a-terms.md
index 7902923c1..5996fe62f 100644
--- a/_pages/manage/annual-assessment/2023-report/2023-appx-a-terms.md
+++ b/_pages/manage/annual-assessment/2023-report/2023-appx-a-terms.md
@@ -53,11 +53,11 @@ The following terms are used in this 2023 Assessment report to Congress. Additio
 Business Function Maturity
-    Or simply “maturity” as it is referenced throughout this document, approximates the level of development and advancement of a reporting entity’s Section 508 Program as well as other functions across the organization (the business) that are relevant to digital accessibility. It assesses the reporting entity’s responses to the criteria within 9 maturity dimensions (Q22-Q61, excepting Q27B (See FY23 Data Dictionary (XLSX))). These questions cover different aspects of their Section 508 Program or equivalent as well as accessibility-related activities across the organization. Business function maturity, or “maturity,” is evaluated by the maturity index or “m-index” and utilizes a scale ranging from 0 to 5, with 0 indicating a very low maturity level, and 5 indicating a very high maturity level.
+    Or simply “maturity” as it is referenced throughout this document, approximates the level of development and advancement of a reporting entity’s Section 508 Program as well as other functions across the organization (the business) that are relevant to digital accessibility. It assesses the reporting entity’s responses to the criteria within 9 maturity dimensions (Q22-Q61, excepting Q27B (See FY23 Data Dictionary (XLSX))). These questions cover different aspects of their Section 508 Program or equivalent as well as accessibility-related activities across the organization. Business function maturity, or “maturity,” is evaluated by the maturity index or “m-index” and utilizes a scale ranging from 0 to 5, with 0 indicating a very low maturity level, and 5 indicating a very high maturity level.
 c-index
-    Assesses reporting entity conformance related to approximation of conformance to Section 508 standards. It measures how well reporting entities meet specific criteria within the Conformance dimension, using numerical values and weights for 16 criteria, including Q61, Q79, Q71, Q78, Q80 to Q85, and Q87 to Q92 (See FY23 Data Dictionary (XLSX)). The index uses a scale from 0 to 5, with 0 representing very low and 5 representing very high conformance levels.
+    Assesses reporting entity conformance related to approximation of conformance to Section 508 standards. It measures how well reporting entities meet specific criteria within the Conformance dimension, using numerical values and weights for 16 criteria, including Q61, Q79, Q71, Q78, Q80 to Q85, and Q87 to Q92 (See FY23 Data Dictionary (XLSX)). The index uses a scale from 0 to 5, with 0 representing very low and 5 representing very high conformance levels.
 Category
@@ -80,7 +80,7 @@ The following terms are used in this 2023 Assessment report to Congress. Additio
 Criteria
-    Refers to the 105 Assessment criteria (See FY23 Data Dictionary (XLSX)) that reporting entities responded to, split into 11 dimensions. Due to dependencies, some reporting entities may have responded to fewer than 105 criteria. Also denoted with “Q” before a number to identify specific criteria referenced. The terms criteria, metric, and question are used interchangeably when referring to the content reporting entities responded to.
+    Refers to the 105 Assessment criteria (See FY23 Data Dictionary (XLSX)) that reporting entities responded to, split into 11 dimensions. Due to dependencies, some reporting entities may have responded to fewer than 105 criteria. Also denoted with “Q” before a number to identify specific criteria referenced. The terms criteria, metric, and question are used interchangeably when referring to the content reporting entities responded to.
 Dimension
@@ -132,7 +132,7 @@ The following terms are used in this 2023 Assessment report to Congress. Additio
 Operational Conformance
-    Or simply referenced as “conformance” throughout this document. Operational conformance approximates how effectively reporting entities adhere to the relevant requirements for Section 508. Operational conformance is measured by the conformance index or “c-index,” which assesses a reporting entity’s conformance by quantifying their responses to 16 specific criteria within the Conformance dimension. These criteria include Q61, Q79, Q71, Q78, Q80 to Q85, and Q87 to Q92 (See FY23 Data Dictionary (XLSX)). Each of these criteria is assigned numerical values and weights to determine the overall conformance score.
+    Or simply referenced as “conformance” throughout this document. Operational conformance approximates how effectively reporting entities adhere to the relevant requirements for Section 508. Operational conformance is measured by the conformance index or “c-index,” which assesses a reporting entity’s conformance by quantifying their responses to 16 specific criteria within the Conformance dimension. These criteria include Q61, Q79, Q71, Q78, Q80 to Q85, and Q87 to Q92 (See FY23 Data Dictionary (XLSX)). Each of these criteria is assigned numerical values and weights to determine the overall conformance score.
 Overall performance
diff --git a/_pages/manage/annual-assessment/2023-report/2023-appx-c-methods.md b/_pages/manage/annual-assessment/2023-report/2023-appx-c-methods.md
index 212838e74..39b59d8ce 100644
--- a/_pages/manage/annual-assessment/2023-report/2023-appx-c-methods.md
+++ b/_pages/manage/annual-assessment/2023-report/2023-appx-c-methods.md
@@ -22,7 +22,7 @@ The methods outlined in this section are the result of detailed discussions and
 ## Development and Dissemination of Assessment Criteria
 OMB and GSA collaborated with Access Board and OSTP to refine the original pool of 150 criteria into a final, more targeted set of 105 Assessment criteria across 11 dimensions. See [Table C1](#table-c1) for a description of the 11 Assessment dimensions. We collectively determined the selected criteria best represented the current state of Section 508 compliance across the federal government and, through the establishment of a new baseline, presented an opportunity to analyze trends in future years. While some criteria were informed by the previous Section 508 Report to the President and Congress: Accessibility of the Federal Electronic and Information Technology, none were lifted verbatim.
 To help ascertain Section 508 Program maturity based on typical reporting entity practices, the team consulted four maturity models for the nine middle dimensions (i.e., those excluding General Information and Conformance Metrics):
-* [Intelligence Community IT Accessibility Program Maturity Model (PPTX)](https://assets.section508.gov/files/presentations/iaaf/Accessibility%20Maturity%20Models%20-%20IAAF%202022.pptx)
+* [Intelligence Community IT Accessibility Program Maturity Model (PPTX)](https://training.section508.gov/assets/files/iaaf/Accessibility%20Maturity%20Models%20-%20IAAF%202022.pptx)
 * World Wide Web Consortium (W3C) Web Accessibility Initiative (WAI) W3C Accessibility Maturity Model
@@ -97,7 +97,7 @@ GSA developed a script to systematically validate reporting entity data by crite
 ## Descriptive Analysis
 We conducted a descriptive study of the data, akin to an inventory or initial exploration, to provide a holistic view of reporting entity data and determine key patterns and trends. We examined averages, frequency distributions, and other essential statistical parameters for each criteria, paying close attention to core areas that directly tie into our research questions. Two areas, reporting entity “business function maturity” and “operational conformance” (i.e., reporting entity conformance to the applicable requirements in the ICT Standards and Guidelines), emerged as key perspectives and points of interest during our discussions.
-First, we created an index to assess reporting entity business function maturity (m-index). This index quantified reporting entity responses to criteria across 9 dimensions: IT Accessibility Program Office; Policies, Procedures, and Standards; Communications; Content Creation; Human Capital, Culture, and Leadership; Technology Lifecycle Activities; Testing and Validation; Acquisition and Procurement; and Training. The m-index encompassed Questions 22 to 61 except Q27B (See FY23 Data Dictionary (XLSX)), and all were multiple choice format, equally weighted, and scored as follows:
+First, we created an index to assess reporting entity business function maturity (m-index). This index quantified reporting entity responses to criteria across 9 dimensions: IT Accessibility Program Office; Policies, Procedures, and Standards; Communications; Content Creation; Human Capital, Culture, and Leadership; Technology Lifecycle Activities; Testing and Validation; Acquisition and Procurement; and Training. The m-index encompassed Questions 22 to 61 except Q27B (See FY23 Data Dictionary (XLSX)), and all were multiple choice format, equally weighted, and scored as follows:
 * a) = 0; signifying very low
 * e) = 4; signifying very high
-Furthermore, a selection of “Unknown” received a 0, and a selection of “Not applicable or N/A” received a 4. For a few criteria (Q24, Q30, and Q44) (See FY23 Data Dictionary (XLSX)), (f) = 4 also signifies Very High. We considered that argument and understood that scoring N/A as a “4” could inflate a reporting entity score for a dimension, but we nonetheless chose this so that all reporting entities had an equal number of questions to score (the denominator would be the same for each reporting entity) and no reporting entity was penalized with a low score for N/A (i.e., things that do not apply to them).
+Furthermore, a selection of “Unknown” received a 0, and a selection of “Not applicable or N/A” received a 4. For a few criteria (Q24, Q30, and Q44) (See FY23 Data Dictionary (XLSX)), (f) = 4 also signifies Very High. We considered that argument and understood that scoring N/A as a “4” could inflate a reporting entity score for a dimension, but we nonetheless chose this so that all reporting entities had an equal number of questions to score (the denominator would be the same for each reporting entity) and no reporting entity was penalized with a low score for N/A (i.e., things that do not apply to them).
-Second, we created an operational conformance (referred to as “conformance” or “c-index”) index to assess how well reporting entities performed per Section 508 requirements. Thus, this index quantified select reporting entity responses to 16 specific criteria in the Conformance section of criteria that directly relate to quantifiable compliance outcomes and included: Q61, Q79, Q71, Q78, Q80 to Q85, and Q87 to Q92 (See FY23 Data Dictionary (XLSX)). They were assigned numerical values and weighted as shown in [Table C2](#table-c2).
+Second, we created an operational conformance (referred to as “conformance” or “c-index”) index to assess how well reporting entities performed per Section 508 requirements. Thus, this index quantified select reporting entity responses to 16 specific criteria in the Conformance section of criteria that directly relate to quantifiable compliance outcomes and included: Q61, Q79, Q71, Q78, Q80 to Q85, and Q87 to Q92 (See FY23 Data Dictionary (XLSX)). They were assigned numerical values and weighted as shown in [Table C2](#table-c2).
@@ -197,7 +197,7 @@ Regression analysis describes a way to understand how different variables relate
 * Centers for Medicare and Medicaid (CMS) report to Congress, "Medicare Home Health Study: An Investigation on Access to Care and Payment for Vulnerable Patient Populations"
-Multivariable linear regression can explain how two or more independent variables affect a single dependent variable. In the Assessment, we conducted simple linear and multivariable linear regressions to understand how different variables within a reporting entity relate to one another to influence that reporting entity’s Section 508 compliance. Variables corresponded directly from criteria (See FY23 Data Dictionary (XLSX)), and included but were not limited to the following:
+Multivariable linear regression can explain how two or more independent variables affect a single dependent variable. In the Assessment, we conducted simple linear and multivariable linear regressions to understand how different variables within a reporting entity relate to one another to influence that reporting entity’s Section 508 compliance. Variables corresponded directly from criteria (See FY23 Data Dictionary (XLSX)), and included but were not limited to the following:
 * Hours a Section 508 PM spends working on the reporting entity’s Section 508 Program every week (Q3)
diff --git a/_pages/manage/annual-assessment/2023-report/2023-appx-d-overview.md b/_pages/manage/annual-assessment/2023-report/2023-appx-d-overview.md
index ad812dcc3..572b41e44 100644
--- a/_pages/manage/annual-assessment/2023-report/2023-appx-d-overview.md
+++ b/_pages/manage/annual-assessment/2023-report/2023-appx-d-overview.md
@@ -62,9 +62,9 @@ Comprehensive submission data by reporting entity can be found at: [Section508.g
 * Identification of parent agency (if applicable)
-* Maturity Bracket: This measure of a reporting entity’s Section 508 Program maturity assesses responses to criteria across 9 dimensions, encompassing Questions 22 to 61 except Q27B (See FY23 Data Dictionary (XLSX)), and all were multiple choice format, equally weighted, and scored as noted in Methods, Descriptive Analysis. This maturity bracket consists of an index using a scale from 0 to 5, with 0 representing very low and 5 representing very high maturity levels.
+* Maturity Bracket: This measure of a reporting entity’s Section 508 Program maturity assesses responses to criteria across 9 dimensions, encompassing Questions 22 to 61 except Q27B (See FY23 Data Dictionary (XLSX)), and all were multiple choice format, equally weighted, and scored as noted in Methods, Descriptive Analysis. This maturity bracket consists of an index using a scale from 0 to 5, with 0 representing very low and 5 representing very high maturity levels.
-* Conformance Bracket: This measure of a reporting entity’s Conformance Metrics consists of an index using a scale from 0 to 5, with 0 representing very low and 5 representing very high conformance. This index quantified select responses to 16 specific criteria in the Conformance section of criteria which directly relate to quantifiable compliance outcomes and included: Q61, Q79, Q71, Q78, Q80 to Q85, and Q87 to Q92 (See FY23 Data Dictionary (XLSX)). They were assigned numerical values and weighted as shown in Table C2.
+* Conformance Bracket: This measure of a reporting entity’s Conformance Metrics consists of an index using a scale from 0 to 5, with 0 representing very low and 5 representing very high conformance. This index quantified select responses to 16 specific criteria in the Conformance section of criteria which directly relate to quantifiable compliance outcomes and included: Q61, Q79, Q71, Q78, Q80 to Q85, and Q87 to Q92 (See FY23 Data Dictionary (XLSX)). They were assigned numerical values and weighted as shown in Table C2.
 * Recommendations specific to the reporting entity bracket linked to Maturity-Conformance category (within Findings).
diff --git a/_pages/manage/annual-assessment/2023-report/2023-findings-cat-high.md b/_pages/manage/annual-assessment/2023-report/2023-findings-cat-high.md
index c34e8611b..9cdc02576 100644
--- a/_pages/manage/annual-assessment/2023-report/2023-findings-cat-high.md
+++ b/_pages/manage/annual-assessment/2023-report/2023-findings-cat-high.md
@@ -37,7 +37,7 @@ format: HTML (html)
 * All reporting entities in this category test internet pages, but all reported very low conformance rates (22% or less).
-* All 4 reporting entities responded that they lacked the resources to test the digital content requested by the Assessment (Q78-81) (See FY23 Data Dictionary (XLSX)).
+* All 4 reporting entities responded that they lacked the resources to test the digital content requested by the Assessment (Q78-81) (See FY23 Data Dictionary (XLSX)).
 ### Overall Recommendations
 All reporting entities in the High-Very Low category performed better than expected in Communications, Content Creation, and IT Accessibility Program activities. Reporting entities in this category should focus on developer capabilities and allocating proper resources for Section 508 testing. Reporting entities should also make incremental improvements across all dimensions with particular focus on Testing and Technology Lifecycle Activities.
diff --git a/_pages/manage/annual-assessment/2023-report/2023-findings-cat-low.md b/_pages/manage/annual-assessment/2023-report/2023-findings-cat-low.md
index 7a35ab260..651b5d815 100644
--- a/_pages/manage/annual-assessment/2023-report/2023-findings-cat-low.md
+++ b/_pages/manage/annual-assessment/2023-report/2023-findings-cat-low.md
@@ -142,7 +142,7 @@ Reporting entities in the Low-High category had several maturity dimensions high
 * Both reporting entities in this category said Section 508 conformance is regularly included as a technical evaluation factor in solicitations.
-* Both reporting entities in this category do not have a Section 508 training plan or the capacity to provide Section 508 training at all (Q58 and Q60 (See FY23 Data Dictionary (XLSX))).
+* Both reporting entities in this category do not have a Section 508 training plan or the capacity to provide Section 508 training at all (Q58 and Q60 (See FY23 Data Dictionary (XLSX))).
 ## Overall Recommendations:
 Reporting entities in this Low-Very High category had a few standout dimensions with a few below-expected scores. Reporting entities in this category should look to improve their accessibility communications processes. Additionally, although their actual conformance scores were relatively high, their testing maturity was relatively low. We also recommend they invest in their testing infrastructure and processes to ensure repeatability of conformance testing, robustness of testing processes, and overall Section 508 conformance.
diff --git a/_pages/manage/annual-assessment/2023-report/2023-findings-cat-moderate.md b/_pages/manage/annual-assessment/2023-report/2023-findings-cat-moderate.md
index 1eabbb746..630322665 100644
--- a/_pages/manage/annual-assessment/2023-report/2023-findings-cat-moderate.md
+++ b/_pages/manage/annual-assessment/2023-report/2023-findings-cat-moderate.md
@@ -118,7 +118,7 @@ Reporting entities in the Moderate-Moderate category performed as expected with
 * All reporting entities in this category test internet pages, with all reporting over 80% of pages fully conformant.
-* 7 of 8 reporting entities in this category (88%) provided results for the digital content conformance questions (Q78-81 (See FY23 Data Dictionary (XLSX))), with relatively high conformance on average.
+* 7 of 8 reporting entities in this category (88%) provided results for the digital content conformance questions (Q78-81 (See FY23 Data Dictionary (XLSX))), with relatively high conformance on average.
 ### Overall Recommendations:
 Reporting entities in the Moderate-High category performed as expected on trend in most dimensions, but underperformed expectations in Acquisitions and Procurement. Reporting entities in this category should focus on improving the maturity of their training and human capital considerations. Additionally, a concentration on improving maturity in general is suggested as conformance across reporting entities in this category was relatively high.
diff --git a/_pages/manage/annual-assessment/2023-report/2023-findings-dimension.md b/_pages/manage/annual-assessment/2023-report/2023-findings-dimension.md
index 8cc92f2d9..6ea539b61 100644
--- a/_pages/manage/annual-assessment/2023-report/2023-findings-dimension.md
+++ b/_pages/manage/annual-assessment/2023-report/2023-findings-dimension.md
@@ -17,7 +17,7 @@ format: "HTML (html)"
 ---
 ### Dimensional Highlights
-Maturity questions ranged from Q22 to Q61 (excepting Q27B) (See FY23 Data Dictionary (XLSX)). These questions are grouped into the following nine maturity dimensions: IT Accessibility Program Office; Policies, Procedures, and Standards; Communications; Content Creation; Human Capital, Culture, and Leadership; Technology Lifecycle Activities, Testing and Validation; Acquisition and Procurement; and Training. Below are the overall outcomes for governmentwide maturity:
+Maturity questions ranged from Q22 to Q61 (excepting Q27B) (See FY23 Data Dictionary (XLSX)). These questions are grouped into the following nine maturity dimensions: IT Accessibility Program Office; Policies, Procedures, and Standards; Communications; Content Creation; Human Capital, Culture, and Leadership; Technology Lifecycle Activities, Testing and Validation; Acquisition and Procurement; and Training. Below are the overall outcomes for governmentwide maturity:
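To make the m-index scoring described in the Appendix C methods hunk above easier to follow, here is a minimal illustrative sketch in Python. It is not the GSA analysis script: the answer-to-score mapping (a) through e) to 0-4, "Unknown" to 0, "Not applicable or N/A" to 4, with equal weights) comes from the text, while the function name, the example responses, and the linear rescale from the 0-4 answer scale to the published 0-5 index are assumptions made only for illustration.

```python
# Minimal illustrative sketch of the FY23 m-index scoring -- not GSA's code.
# Mapping per the methods text: a)-e) -> 0-4, "Unknown" -> 0, "N/A" -> 4.
ANSWER_SCORES = {"a": 0, "b": 1, "c": 2, "d": 3, "e": 4, "unknown": 0, "n/a": 4}

def m_index(responses):
    """Equally weighted maturity score for one reporting entity.

    `responses` maps a maturity criterion ID (e.g., "Q22") to the answer
    choice selected. Returns a value on a 0-5 scale; the linear rescale
    from the 0-4 answer scale is an assumption made for illustration.
    """
    scores = [ANSWER_SCORES[ans.lower()] for ans in responses.values()]
    return round(sum(scores) / len(scores) * 5 / 4, 2) if scores else 0.0

# Hypothetical responses for a handful of the Q22-Q61 maturity criteria.
print(m_index({"Q22": "d", "Q23": "c", "Q24": "n/a", "Q25": "unknown", "Q26": "e"}))
```

Scoring "Not applicable or N/A" as a 4 keeps the denominator identical across reporting entities, which is the trade-off the methods text itself describes.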
diff --git a/_pages/manage/annual-assessment/2023-report/2023-findings-key-compliance.md b/_pages/manage/annual-assessment/2023-report/2023-findings-key-compliance.md
index 439557099..e4a0de789 100644
--- a/_pages/manage/annual-assessment/2023-report/2023-findings-key-compliance.md
+++ b/_pages/manage/annual-assessment/2023-report/2023-findings-key-compliance.md
@@ -383,8 +383,8 @@ this section as “maturity”) and the conformance index (referred to throughou
-    The nine Maturity dimensions are: IT Accessibility Program Office; Policies, Procedures, and Standards; Communications; Content Creation; Human Capital, Culture, and Leadership; Technology Lifecycle Activities, Testing and Validation; Acquisition and Procurement; and Training. This included responses from questions 22 through 61, excepting Q27B (See FY23 Data Dictionary (XLSX)).
-    Criteria are Q61, Q79, Q71, Q78, Q80 to Q85, and Q87 to Q92 (See FY23 Data Dictionary (XLSX)). Numerical values and weights were assigned as shown in Table C2.
+    The nine Maturity dimensions are: IT Accessibility Program Office; Policies, Procedures, and Standards; Communications; Content Creation; Human Capital, Culture, and Leadership; Technology Lifecycle Activities, Testing and Validation; Acquisition and Procurement; and Training. This included responses from questions 22 through 61, excepting Q27B (See FY23 Data Dictionary (XLSX)).
+    Criteria are Q61, Q79, Q71, Q78, Q80 to Q85, and Q87 to Q92 (See FY23 Data Dictionary (XLSX)). Numerical values and weights were assigned as shown in Table C2.
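For the conformance side referenced in the footnotes above, the c-index combines 16 conformance criteria using the weights in Table C2, which are not reproduced in this patch. The sketch below therefore uses placeholder criterion scores and weights purely to show the shape of a weighted index; it is not the actual calculation.

```python
# Illustrative weighted c-index sketch. Scores and weights are placeholders;
# the real per-criterion weights are defined in Table C2 of the FY23 report.
def c_index(scores, weights):
    """Weighted average of per-criterion conformance scores on a 0-5 scale."""
    total = sum(weights.values())
    return round(sum(scores[q] * weights[q] for q in weights) / total, 2)

scores = {"Q61": 4.0, "Q71": 2.5, "Q78": 3.0}   # hypothetical 0-5 scores
weights = {"Q61": 1.0, "Q71": 2.0, "Q78": 1.0}  # hypothetical weights
print(c_index(scores, weights))
```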
diff --git a/_pages/manage/annual-assessment/2023-report/2023-findings-program-staff.md b/_pages/manage/annual-assessment/2023-report/2023-findings-program-staff.md
index 4ad1c6fc6..0d11386c3 100644
--- a/_pages/manage/annual-assessment/2023-report/2023-findings-program-staff.md
+++ b/_pages/manage/annual-assessment/2023-report/2023-findings-program-staff.md
@@ -44,7 +44,7 @@ format: "HTML (html)"
-In addition, regression analysis showed that the number of hours per week a Section 508 PM supported a Section 508 Program (Q3 (See FY23 Data Dictionary (XLSX))) predicted the status of the Section 508 Program (Q22). For every extra hour spent, the likelihood of a more mature Section 508 Program status increased by 0.048.15 This shows that the more time Section 508 PMs invest, the higher the maturity of the Section 508 Program. Similarly, the number of hours per week a Section 508 PM supported a Section 508 Program (Q3) predicted the amount of resources and staffing within the Section 508 Program (Q26), highlighting the positive impact of increased dedication to Section 508 Program management on resource allocation within federal reporting entities. Every additional hour the Section 508 PM spent translated into a 0.023 governmentwide increase in resources and staffing.16
+In addition, regression analysis showed that the number of hours per week a Section 508 PM supported a Section 508 Program (Q3 (See FY23 Data Dictionary (XLSX))) predicted the status of the Section 508 Program (Q22). For every extra hour spent, the likelihood of a more mature Section 508 Program status increased by 0.048.15 This shows that the more time Section 508 PMs invest, the higher the maturity of the Section 508 Program. Similarly, the number of hours per week a Section 508 PM supported a Section 508 Program (Q3) predicted the amount of resources and staffing within the Section 508 Program (Q26), highlighting the positive impact of increased dedication to Section 508 Program management on resource allocation within federal reporting entities. Every additional hour the Section 508 PM spent translated into a 0.023 governmentwide increase in resources and staffing.16
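The regression findings quoted above (for example, Section 508 Program status regressed on weekly PM hours) rest on simple and multivariable linear regression. The sketch below shows only the general shape of such an ordinary least squares fit; the toy data, the numeric coding of Q22, and the variable names are invented, and it does not reproduce the reported 0.048 coefficient.

```python
# Illustrative OLS sketch -- not the Assessment's analysis code or data.
import numpy as np

# Hypothetical observations: weekly Section 508 PM hours (Q3) and a numeric
# coding of Section 508 Program status (Q22) for a few reporting entities.
pm_hours = np.array([2.0, 5.0, 10.0, 20.0, 35.0, 40.0])
program_status = np.array([0.0, 1.0, 1.0, 2.0, 3.0, 4.0])

# Simple linear regression: status = intercept + slope * hours.
# Adding more predictor columns to `design` gives the multivariable case.
design = np.column_stack([np.ones_like(pm_hours), pm_hours])
(intercept, slope), *_ = np.linalg.lstsq(design, program_status, rcond=None)
print(f"intercept={intercept:.3f}, change in status per extra hour={slope:.3f}")
```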

 Table 4 depicts the average hours per week Section 508 PMs dedicate to their Section 508 Program by maturity bracket. This shows that as maturity increases, so does the average number of hours per week a Section 508 PM dedicates to the Section 508 Program; or vice versa, the more time a Section 508 PM dedicates to their Program, the more mature the Program is.
diff --git a/_pages/manage/annual-assessment/2023-report/2023-findings-testing-lifecycle.md b/_pages/manage/annual-assessment/2023-report/2023-findings-testing-lifecycle.md
index dc81ac535..26d7b11d6 100644
--- a/_pages/manage/annual-assessment/2023-report/2023-findings-testing-lifecycle.md
+++ b/_pages/manage/annual-assessment/2023-report/2023-findings-testing-lifecycle.md
@@ -151,7 +151,7 @@ The Assessment included several questions specifically related to electronic doc

    Furthermore, equivalent regression analysis replacing electronic document conformance with intranet, public internet, and video conformance did not stand out as statistically meaningful. The reasons behind this difference remain uncertain. Improved data quality or year-to-year analysis may shed more light on this matter. For now, regression suggests electronic document conformance serves as a better indicator of Section 508 Program maturity.

 Regression analysis investigated the relationships between reporting entity size, as provided by publicly available datasets from Fedscope OPM, and Section 508 conformance of intranet web pages, public internet web pages, public electronic documents, and videos (Q61, Q71, Q78, Q79, Q80, Q81).23 The results consistently showed reporting entity size, on its own, does not have a meaningful impact on ICT conformance. While size itself may not be a good indicator, reporting entities with a strong department-level or parent agency that offers resources to component reporting entities may achieve higher conformance. Additionally, reporting entities that have a parent-component dynamic have implications for size: we expect the department as a whole is relatively large while components individually are much smaller. For FY23, the criteria did not include tailored questions to pinpoint reporting entities that utilize parent-agency-level resources in order to determine any correlation. We intend to hone questions in FY24 to find correlations between parent and component reporting entities.

-(See FY23 Data Dictionary (XLSX))
+(See FY23 Data Dictionary (XLSX))

 ### Non-Conformance Tracking and Remediation
diff --git a/_pages/manage/annual-assessment/2024-report/2024-appx-b-methods.md b/_pages/manage/annual-assessment/2024-report/2024-appx-b-methods.md
index abc0166f2..b84c20c3e 100644
--- a/_pages/manage/annual-assessment/2024-report/2024-appx-b-methods.md
+++ b/_pages/manage/annual-assessment/2024-report/2024-appx-b-methods.md
@@ -21,7 +21,7 @@ GSA, OMB, and the U.S. Access Board (Access Board) built upon the groundwork lai
 ## Development and Dissemination of Assessment Criteria
-To better evaluate the current state of Section 508 compliance and digital accessibility across the federal government, GSA and OMB, in collaboration with the Access Board and OSTP, refined the FY23 Assessment criteria language for FY24 with a focus on making questions and response options easier to interpret. For example, we used the term "reporting entity" in place of "agency" to encompass both agencies, i.e., bureaus, departments, and headquarters, and components, i.e., organizational units that reside within a department or large agency. Additionally, we added frequency percentages for never, sometimes, regularly, frequently, and almost always directly to response options to enhance clarity. We introduced several new questions, covering topics such as total federal employees, ICT test processes utilized, and exceptions processes. We also introduced 10 new questions---questions 80 to 89---on a rotating basis to broaden the scope of inquiries regarding ICT. We removed five questions due to data quality issues or redundancies and significantly revised answer choices for the following criteria: questions 30 to 33, 36 specifically answer choice d), 39 to 42, 53 to 57, 60, 62 to 63, and 65. All 103 questions were mandatory for FY24, some with dependencies. For a complete list of Assessment criteria changes, please reference this crosswalk between FY23 criteria and FY24 criteria.
+To better evaluate the current state of Section 508 compliance and digital accessibility across the federal government, GSA and OMB, in collaboration with the Access Board and OSTP, refined the FY23 Assessment criteria language for FY24 with a focus on making questions and response options easier to interpret. For example, we used the term "reporting entity" in place of "agency" to encompass both agencies, i.e., bureaus, departments, and headquarters, and components, i.e., organizational units that reside within a department or large agency. Additionally, we added frequency percentages for never, sometimes, regularly, frequently, and almost always directly to response options to enhance clarity. We introduced several new questions, covering topics such as total federal employees, ICT test processes utilized, and exceptions processes. We also introduced 10 new questions---questions 80 to 89---on a rotating basis to broaden the scope of inquiries regarding ICT. We removed five questions due to data quality issues or redundancies and significantly revised answer choices for the following criteria: questions 30 to 33, 36 specifically answer choice d), 39 to 42, 53 to 57, 60, 62 to 63, and 65. All 103 questions were mandatory for FY24, some with dependencies. For a complete list of Assessment criteria changes, please reference this crosswalk between FY23 criteria and FY24 criteria (XLSX).
 While the criteria underwent minor structural changes, their major organizing framework remained intact. Please see the subsection in [Appendix C: Methods]({{site.baseurl}}/manage/section-508-assessment/2023/appendix-c-methods/) from the FY23 Assessment for more information on how we developed the original Assessment criteria. The 11 dimensions that categorize the criteria remained unchanged from FY23. Table B1 describes each of the 11 dimensions.
diff --git a/_pages/manage/annual-assessment/2024-report/2024-introduction.md b/_pages/manage/annual-assessment/2024-report/2024-introduction.md
index e17f49870..9fa3064bb 100644
--- a/_pages/manage/annual-assessment/2024-report/2024-introduction.md
+++ b/_pages/manage/annual-assessment/2024-report/2024-introduction.md
@@ -50,7 +50,7 @@ The makeup of all [245 reporting entities]({{site.baseurl}}/manage/section-508-a
 **Please note that the terms "reporting entity" and "respondent" used in this Assessment are not synonymous with "agencies" as used in Section 508 of the Rehabilitation Act or M-24-08. Thus, not all respondents may be subject to Section 508 implementation requirements or to M-24-08 guidance.**
-OMB, in close coordination with GSA, the Access Board, the Office of Science and Technology Policy (OSTP), and the Department of Justice (DOJ), refined the assessment criteria from the previous year, incorporating feedback from agencies to thoroughly and efficiently evaluate reporting entity Section 508 compliance. For a complete list of changes, please reference the crosswalk between FY23 and FY24 criteria. GSA also refined additional context for understanding the criteria, Frequently Asked Questions (FAQs), and defining terms to help reporting entities collect accurate data for the Assessment.
+OMB, in close coordination with GSA, the Access Board, the Office of Science and Technology Policy (OSTP), and the Department of Justice (DOJ), refined the assessment criteria from the previous year, incorporating feedback from agencies to thoroughly and efficiently evaluate reporting entity Section 508 compliance. For a complete list of changes, please reference the crosswalk between FY23 and FY24 criteria (XLSX). GSA also refined additional context for understanding the criteria, Frequently Asked Questions (FAQs), and defining terms to help reporting entities collect accurate data for the Assessment.
 OMB broadly distributed instructions and 103 assessment criteria to heads of agencies, agency Chief Information Officers (CIO), and Section 508 Program Managers (PM) on April 8, 2024. Additionally, GSA posted the instructions and criteria (XLSX) on Section508.gov the same day.
-Table C2. Topics, Conversion Approaches, and Weights of Conformance Criteria (See FY23 Data Dictionary (XLSX))
+Table C2. Topics, Conversion Approaches, and Weights of Conformance Criteria (See FY23 Data Dictionary (XLSX))
 Topic